Sep 30 13:39:08 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 13:39:08 crc restorecon[4641]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:39:08 crc restorecon[4641]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc 
restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc 
restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc 
restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc 
restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:08
crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:39:08 crc restorecon[4641]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:08 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:39:09 crc restorecon[4641]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 
crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc 
restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc 
restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc 
restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc 
restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc 
restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:39:09 crc restorecon[4641]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 13:39:10 crc kubenswrapper[4936]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.031703 4936 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047325 4936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047422 4936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047437 4936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047452 4936 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047463 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047472 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047482 4936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047491 4936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047501 4936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047510 4936 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047522 4936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047533 4936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047542 4936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047551 4936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047559 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047567 4936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047576 4936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047584 4936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047592 4936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047601 4936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047609 4936 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047617 4936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047626 4936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047634 4936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047642 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047651 4936 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047659 4936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047668 4936 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047676 4936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047684 4936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047692 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047700 4936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047709 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047742 4936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047751 4936 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047760 4936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047768 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047777 4936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047788 4936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047799 4936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047809 4936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047820 4936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047828 4936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047837 4936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047846 4936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047854 4936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047862 4936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047870 4936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047879 4936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047888 4936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047896 4936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047904 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047912 4936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047921 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047929 4936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047938 4936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047946 4936 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047955 4936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047966 4936 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047975 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047984 4936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.047992 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048001 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048010 4936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048021 4936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048031 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048043 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048051 4936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048060 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048069 4936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.048077 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049040 4936 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049065 4936 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049078 4936 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049086 4936 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049094 4936 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049100 4936 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049108 4936 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049116 4936 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049123 4936 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049129 4936 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049136 4936 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049143 4936 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049150 4936 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049156 4936 flags.go:64] FLAG: --cgroup-root=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049162 4936 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049168 4936 flags.go:64] FLAG: --client-ca-file=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049173 4936 flags.go:64] FLAG: --cloud-config=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049179 4936 flags.go:64] FLAG: --cloud-provider=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049185 4936 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049192 4936 flags.go:64] FLAG: --cluster-domain=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049198 4936 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049204 4936 flags.go:64] FLAG: --config-dir=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049209 4936 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049215 4936 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049223 4936 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049229 4936 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049234 4936 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049240 4936 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049246 4936 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049252 4936 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049257 4936 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049263 4936 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049269 4936 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049276 4936 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049282 4936 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049288 4936 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049294 4936 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049299 4936 flags.go:64] FLAG: --enable-server="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049305 4936 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049314 4936 flags.go:64] FLAG: --event-burst="100"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049319 4936 flags.go:64] FLAG: --event-qps="50"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049325 4936 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049358 4936 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049365 4936 flags.go:64] FLAG: --eviction-hard=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049372 4936 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049378 4936 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049383 4936 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049390 4936 flags.go:64] FLAG: --eviction-soft=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049397 4936 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049403 4936 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049409 4936 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049417 4936 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049422 4936 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049428 4936 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049434 4936 flags.go:64] FLAG: --feature-gates=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049480 4936 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049487 4936 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049493 4936 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049499 4936 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049505 4936 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049511 4936 flags.go:64] FLAG: --help="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049518 4936 flags.go:64] FLAG: --hostname-override=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049523 4936 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049529 4936 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049535 4936 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049541 4936 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049546 4936 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049552 4936 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049557 4936 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049563 4936 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049569 4936 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049575 4936 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049580 4936 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049586 4936 flags.go:64] FLAG: --kube-reserved=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049592 4936 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049597 4936 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049603 4936 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049608 4936 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049614 4936 flags.go:64] FLAG: --lock-file=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049619 4936 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049625 4936 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049631 4936 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049639 4936 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049647 4936 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049652 4936 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049658 4936 flags.go:64] FLAG: --logging-format="text"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049664 4936 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049670 4936 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049676 4936 flags.go:64] FLAG: --manifest-url=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049681 4936 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049689 4936 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049694 4936 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049701 4936 flags.go:64] FLAG: --max-pods="110"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049707 4936 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049713 4936 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049719 4936 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049725 4936 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049731 4936 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049736 4936 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049742 4936 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049757 4936 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049762 4936 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049768 4936 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049774 4936 flags.go:64] FLAG: --pod-cidr=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049780 4936 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049791 4936 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049796 4936 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049802 4936 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049808 4936 flags.go:64] FLAG: --port="10250"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049815 4936 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049821 4936 flags.go:64] FLAG: --provider-id=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049826 4936 flags.go:64] FLAG: --qos-reserved=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049832 4936 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049838 4936 flags.go:64] FLAG: --register-node="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049845 4936 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049850 4936 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049861 4936 flags.go:64] FLAG: --registry-burst="10"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049867 4936 flags.go:64] FLAG: --registry-qps="5"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049872 4936 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049879 4936 flags.go:64] FLAG: --reserved-memory=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049907 4936 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049914 4936 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049920 4936 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049925 4936 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049931 4936 flags.go:64] FLAG: --runonce="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049937 4936 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049944 4936 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049950 4936 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049956 4936 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049961 4936 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049967 4936 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049973 4936 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049979 4936 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049985 4936 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049991 4936 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.049996 4936 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050002 4936 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050008 4936 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050014 4936 flags.go:64] FLAG: --system-cgroups=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050020 4936 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050031 4936 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050037 4936 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050042 4936 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050050 4936 flags.go:64] FLAG: --tls-min-version=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050055 4936 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050061 4936 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050067 4936 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050073 4936 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050079 4936 flags.go:64] FLAG: --v="2"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050087 4936 flags.go:64] FLAG: --version="false"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050094 4936 flags.go:64] FLAG: --vmodule=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050101 4936 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050107 4936 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050251 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050259 4936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050265 4936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050271 4936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050277 4936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050283 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050288 4936 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050293 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050301 4936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050307 4936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050313 4936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050320 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050325 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050358 4936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050364 4936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050371 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050378 4936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050385 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050392 4936 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050398 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050404 4936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050409 4936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050416 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050421 4936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050427 4936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050432 4936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050444 4936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050449 4936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050454 4936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050460 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050464 4936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050470 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050476 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050482 4936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050488 4936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050493 4936 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050499 4936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050504 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050510 4936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050514 4936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050520 4936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050524 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050530 4936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050535 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050540 4936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050545 4936 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050549 4936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050554 4936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050560 4936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050564 4936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050569 4936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050574 4936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050579 4936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050585 4936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050591 4936 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050596 4936 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050601 4936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050607 4936 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050615 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050620 4936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050626 4936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050631 4936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050636 4936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050642 4936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050647 4936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050654 4936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050660 4936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050666 4936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050672 4936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050677 4936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.050683 4936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.050692 4936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.059980 4936 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.060031 4936 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060187 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060206 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060215 4936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060223 4936 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060231 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060239 4936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060246 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060253 4936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060260 4936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060270 4936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060282 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060290 4936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060300 4936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060309 4936 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060317 4936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060325 4936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060356 4936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060362 4936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060393 4936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060400 4936 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060405 4936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060411 4936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060417 4936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060422 4936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 
13:39:10.060427 4936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060432 4936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060437 4936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060443 4936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060448 4936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060455 4936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060461 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060469 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060475 4936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060517 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060524 4936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060531 4936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060537 4936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060544 4936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060551 4936 feature_gate.go:330] unrecognized 
feature gate: OVNObservability Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060557 4936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060564 4936 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060571 4936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060581 4936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060590 4936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060600 4936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060607 4936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060615 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060621 4936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060627 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060633 4936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060638 4936 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060644 4936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060650 4936 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060656 4936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060661 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060666 4936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060671 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060677 4936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060682 4936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060688 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060693 4936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060698 4936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060704 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060709 4936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060715 4936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060723 4936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060728 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:39:10 crc kubenswrapper[4936]: 
W0930 13:39:10.060733 4936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060738 4936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060743 4936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060748 4936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.060759 4936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060939 4936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060949 4936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060955 4936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060961 4936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060969 4936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060975 4936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060982 4936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS 
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060989 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.060994 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061001 4936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061008 4936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061014 4936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061020 4936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061026 4936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061032 4936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061038 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061043 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061048 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061053 4936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061058 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061063 4936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:39:10 
crc kubenswrapper[4936]: W0930 13:39:10.061068 4936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061074 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061079 4936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061085 4936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061090 4936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061096 4936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061101 4936 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061107 4936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061113 4936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061118 4936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061123 4936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061129 4936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061134 4936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061140 4936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061145 4936 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061151 4936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061157 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061163 4936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061168 4936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061173 4936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061178 4936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061183 4936 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061189 4936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061194 4936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061199 4936 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061206 4936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061213 4936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061219 4936 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061226 4936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061233 4936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061240 4936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061247 4936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061254 4936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061261 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061268 4936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061277 4936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061283 4936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061289 4936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061295 4936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061300 4936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061305 4936 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061313 4936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061319 4936 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061325 4936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061353 4936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061359 4936 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061367 4936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061374 4936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061380 4936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.061386 4936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.061396 4936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 
13:39:10.061701 4936 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.067070 4936 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.067180 4936 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.069308 4936 server.go:997] "Starting client certificate rotation" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.069331 4936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.070484 4936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-20 09:09:12.2974175 +0000 UTC Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.070619 4936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1939h30m2.226806425s for next certificate rotation Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.111327 4936 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.114766 4936 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.133384 4936 log.go:25] "Validated CRI v1 runtime API" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.175461 4936 log.go:25] "Validated CRI v1 image API" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.177807 4936 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.184479 4936 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-12-18-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.184520 4936 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.206598 4936 manager.go:217] Machine: {Timestamp:2025-09-30 13:39:10.203954287 +0000 UTC m=+0.587956628 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:db7ffece-d862-468a-996c-f544c38024fc BootID:8be31134-e63d-454e-b952-15f6f996f2b7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:2519945216 
Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f6:d0:37 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f6:d0:37 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:06:76:92 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:77:68:52 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cc:cf:bf Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a6:4c:8e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:34:e8:22 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:02:79:35:ff:15:65 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:c7:85:4e:97:4a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.206856 4936 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207044 4936 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207446 4936 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207679 4936 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207726 4936 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207984 4936 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.207998 4936 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.208556 4936 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.208599 4936 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.209166 4936 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.209251 4936 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.213619 4936 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.213651 4936 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.213671 4936 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.213686 4936 kubelet.go:324] "Adding apiserver pod source"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.213707 4936 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.218930 4936 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.220036 4936 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.222085 4936 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.222923 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.222937 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.223085 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.223097 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223646 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223730 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223789 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223838 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223889 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223942 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.223993 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.224048 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.224108 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.224163 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.224257 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.224320 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.225143 4936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.225705 4936 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.225888 4936 server.go:1280] "Started kubelet"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.226108 4936 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.226534 4936 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.226966 4936 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.228311 4936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.228384 4936 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.228639 4936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:50:27.079405307 +0000 UTC
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.228686 4936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2271h11m16.850723507s for next certificate rotation
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.228939 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.229144 4936 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.229160 4936 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.229277 4936 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 13:39:10 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.230153 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.64:6443: connect: connection refused" interval="200ms"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.230207 4936 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.230985 4936 factory.go:153] Registering CRI-O factory
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232096 4936 factory.go:221] Registration of the crio container factory successfully
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232224 4936 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232251 4936 factory.go:55] Registering systemd factory
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232267 4936 factory.go:221] Registration of the systemd container factory successfully
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232308 4936 factory.go:103] Registering Raw factory
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.232364 4936 manager.go:1196] Started watching for new ooms in manager
Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.236269 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.236420 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.249929 4936 manager.go:319] Starting recovery of all containers
Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.243237 4936 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.64:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a130cd207a24e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 13:39:10.225850958 +0000 UTC m=+0.609853259,LastTimestamp:2025-09-30 13:39:10.225850958 +0000 UTC m=+0.609853259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.258991 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.259946 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.260031 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.260109 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.260171 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.260226 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.262475 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.262570 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.262826 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.262897 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.262957 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263016 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263498 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263610 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263685 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263764 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.263848 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264774 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264855 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264882 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264905 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264929 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264953 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.264976 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265004 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265027 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265106 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265133 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265160 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265183 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265206 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265231 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265256 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265279 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265304 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265375 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265400 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265423 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265454 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265477 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265500 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265523 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265544 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265570 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265592 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265656 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265680 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265702 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265728 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265751 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265773 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265796 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265826 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265853 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265878 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265902 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265929 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265952 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.265977 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266002 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266024 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266048 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266133 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266157 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266182 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266205 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266228 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266251 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266275 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266300 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266324 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266373 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266397 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266924 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266960 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.266983 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267006 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267027 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267048 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267069 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267093 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267116 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267137 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267158 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267179 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267199 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267223 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267244 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert"
seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267264 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267284 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267304 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267324 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267432 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267455 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 
13:39:10.267477 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267498 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267521 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267541 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267563 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267584 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267605 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267626 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267646 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267668 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267695 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267719 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267745 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267768 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267790 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267810 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267841 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267864 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267895 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267917 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267938 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267960 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267979 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.267999 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268019 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268040 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268061 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268080 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268099 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268120 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268138 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 
13:39:10.268158 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268177 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268198 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268220 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268239 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268261 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268282 4936 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268301 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268321 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268375 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268396 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268417 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268436 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268455 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268474 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268494 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268512 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268532 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268553 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268574 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268593 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268615 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268636 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268658 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268676 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: 
I0930 13:39:10.268695 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268713 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268733 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268754 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268773 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268792 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268811 4936 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268832 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268853 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268876 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268897 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268916 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268936 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268956 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268979 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.268999 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269019 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269039 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269061 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269082 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269101 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269121 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269140 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269160 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269179 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269199 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269220 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269239 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269260 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269279 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269298 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" 
seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269388 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269412 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269431 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269454 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269474 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269493 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 
13:39:10.269513 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269532 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269551 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269664 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269683 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.269707 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.271950 4936 manager.go:324] Recovery completed Sep 30 13:39:10 crc 
kubenswrapper[4936]: I0930 13:39:10.272621 4936 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272672 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272696 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272719 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272739 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272760 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" 
seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272783 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272804 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272825 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272846 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272866 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272885 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272904 4936 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272925 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272944 4936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272965 4936 reconstruct.go:97] "Volume reconstruction finished" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.272981 4936 reconciler.go:26] "Reconciler: start to sync state" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.283421 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.286622 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.286696 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.286709 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.287791 4936 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.287809 4936 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.287834 4936 state_mem.go:36] "Initialized new in-memory state store" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.312601 4936 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.314018 4936 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.314070 4936 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.314096 4936 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.314200 4936 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.316758 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.316853 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.317802 4936 policy_none.go:49] "None policy: Start" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.318671 4936 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.318714 4936 
state_mem.go:35] "Initializing new in-memory state store" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.329773 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.370893 4936 manager.go:334] "Starting Device Plugin manager" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.370944 4936 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.370971 4936 server.go:79] "Starting device plugin registration server" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.371435 4936 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.371689 4936 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.372117 4936 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.372250 4936 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.372260 4936 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.380643 4936 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.414949 4936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:39:10 crc 
kubenswrapper[4936]: I0930 13:39:10.415056 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.416498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.416548 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.416559 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.416704 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.416972 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417007 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417747 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417771 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417786 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417921 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.417998 4936 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.418011 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.418322 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.418432 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.418474 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419295 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419327 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419350 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419652 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419858 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.419991 4936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.420031 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.420790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.420815 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.420825 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.421663 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.421708 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.421721 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.421895 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.422081 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.422136 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424178 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424323 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424364 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424372 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424397 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.424377 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.427533 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.427580 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.427599 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.430900 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.64:6443: connect: connection refused" interval="400ms" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.472448 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.474143 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.474193 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.474209 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 
13:39:10.474242 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.475349 4936 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.64:6443: connect: connection refused" node="crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475468 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475527 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475556 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475617 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475661 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475684 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475710 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475728 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475747 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475762 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475799 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475816 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475832 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475847 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.475861 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.576986 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577235 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577443 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577552 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577576 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577591 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577607 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577623 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577639 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577652 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577665 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577673 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577680 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577718 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577718 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577733 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577743 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577751 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577773 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577701 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577756 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577797 4936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577808 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577808 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577826 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577835 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577834 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 
30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577851 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.577864 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.578008 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.676028 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.678871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.678925 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.678936 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.678979 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.679481 4936 kubelet_node_status.go:99] "Unable to register node with API 
server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.64:6443: connect: connection refused" node="crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.761090 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.778743 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.801430 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: I0930 13:39:10.822098 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.822615 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cd6c97d66d048123a779c2f5bdffe4bdb43d77575e1e42a4da48795f71b03f1c WatchSource:0}: Error finding container cd6c97d66d048123a779c2f5bdffe4bdb43d77575e1e42a4da48795f71b03f1c: Status 404 returned error can't find the container with id cd6c97d66d048123a779c2f5bdffe4bdb43d77575e1e42a4da48795f71b03f1c Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.827476 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-012196b5d8f5b8a0643e33685a7bc6ac2c6e3bc22690521c860519f200375de6 WatchSource:0}: Error finding container 012196b5d8f5b8a0643e33685a7bc6ac2c6e3bc22690521c860519f200375de6: Status 404 returned error can't find the container with id 012196b5d8f5b8a0643e33685a7bc6ac2c6e3bc22690521c860519f200375de6 Sep 30 13:39:10 crc 
kubenswrapper[4936]: I0930 13:39:10.828719 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:10 crc kubenswrapper[4936]: E0930 13:39:10.832085 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.64:6443: connect: connection refused" interval="800ms" Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.841872 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c1517276849c85de4e59ae352f25978b48059ab9bcffa30c9d70c9c1419631b5 WatchSource:0}: Error finding container c1517276849c85de4e59ae352f25978b48059ab9bcffa30c9d70c9c1419631b5: Status 404 returned error can't find the container with id c1517276849c85de4e59ae352f25978b48059ab9bcffa30c9d70c9c1419631b5 Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.843491 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3331b9aad30f517cac45f087e95cd37a81ef48f8487fa81f3d1089ced248bb8a WatchSource:0}: Error finding container 3331b9aad30f517cac45f087e95cd37a81ef48f8487fa81f3d1089ced248bb8a: Status 404 returned error can't find the container with id 3331b9aad30f517cac45f087e95cd37a81ef48f8487fa81f3d1089ced248bb8a Sep 30 13:39:10 crc kubenswrapper[4936]: W0930 13:39:10.852188 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-33944dddee524ea1ebefb1092ba77e50322151250d5a02a8860a6d18c6622b36 WatchSource:0}: Error finding container 33944dddee524ea1ebefb1092ba77e50322151250d5a02a8860a6d18c6622b36: Status 
404 returned error can't find the container with id 33944dddee524ea1ebefb1092ba77e50322151250d5a02a8860a6d18c6622b36 Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.079814 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.081016 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.081066 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.081076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.081096 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.081449 4936 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.64:6443: connect: connection refused" node="crc" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.227213 4936 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:11 crc kubenswrapper[4936]: W0930 13:39:11.248596 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.248682 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.318739 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1517276849c85de4e59ae352f25978b48059ab9bcffa30c9d70c9c1419631b5"} Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.320953 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"012196b5d8f5b8a0643e33685a7bc6ac2c6e3bc22690521c860519f200375de6"} Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.322010 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cd6c97d66d048123a779c2f5bdffe4bdb43d77575e1e42a4da48795f71b03f1c"} Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.324921 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"33944dddee524ea1ebefb1092ba77e50322151250d5a02a8860a6d18c6622b36"} Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.326384 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3331b9aad30f517cac45f087e95cd37a81ef48f8487fa81f3d1089ced248bb8a"} Sep 30 13:39:11 crc kubenswrapper[4936]: W0930 13:39:11.489048 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.489477 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.632606 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.64:6443: connect: connection refused" interval="1.6s" Sep 30 13:39:11 crc kubenswrapper[4936]: W0930 13:39:11.733783 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.733850 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:11 crc kubenswrapper[4936]: W0930 13:39:11.829139 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:11 crc 
kubenswrapper[4936]: E0930 13:39:11.829204 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.882033 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.883462 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.883519 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.883531 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:11 crc kubenswrapper[4936]: I0930 13:39:11.883564 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:11 crc kubenswrapper[4936]: E0930 13:39:11.884144 4936 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.64:6443: connect: connection refused" node="crc" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.227405 4936 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.332293 4936 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="ddd89827b85338f2e1555b12d4fd87323ab5dd62b690f2fb36753e521131cb42" exitCode=0 Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.332367 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ddd89827b85338f2e1555b12d4fd87323ab5dd62b690f2fb36753e521131cb42"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.332444 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.333423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.333456 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.333468 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.334817 4936 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="36f44aca3adbafbea2bb815782511bea9e78b4524a31d0b32749d66eda666c08" exitCode=0 Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.334881 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"36f44aca3adbafbea2bb815782511bea9e78b4524a31d0b32749d66eda666c08"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.334925 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.335913 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 
crc kubenswrapper[4936]: I0930 13:39:12.335955 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.335969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.339108 4936 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac" exitCode=0 Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.339362 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.339432 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.340574 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.340618 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.340631 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.348718 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.348780 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.348791 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.348806 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.348939 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.354979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.355030 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.355109 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.357439 4936 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5" exitCode=0 Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.357484 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5"} Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.357604 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.360432 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.360469 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.360480 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.364209 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.366663 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.366705 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:12 crc kubenswrapper[4936]: I0930 13:39:12.366720 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.227180 4936 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:13 crc kubenswrapper[4936]: E0930 13:39:13.233891 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.64:6443: connect: connection refused" interval="3.2s" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.363751 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.363794 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.363806 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.364125 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.364551 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.364574 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.364584 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.380282 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.380348 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.380362 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.380371 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.384466 4936 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d8a7197cef2ee198fbb5ad5cea725bf0171b8ff2ea8e2f7d3a398b7bf630986b" exitCode=0 Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.384553 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d8a7197cef2ee198fbb5ad5cea725bf0171b8ff2ea8e2f7d3a398b7bf630986b"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.384592 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.385730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.385761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.385773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.387804 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.387903 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388138 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0cfca0b1bd5481bea13af586db630f1141ff976d3da01212a59f178d8293bc1b"} Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388574 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388953 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388975 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.388985 4936 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.485125 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.486433 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.486472 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.486484 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.486520 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:13 crc kubenswrapper[4936]: E0930 13:39:13.486943 4936 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.64:6443: connect: connection refused" node="crc" Sep 30 13:39:13 crc kubenswrapper[4936]: W0930 13:39:13.620247 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:13 crc kubenswrapper[4936]: E0930 13:39:13.620318 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:13 crc kubenswrapper[4936]: I0930 13:39:13.728853 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:13 crc kubenswrapper[4936]: W0930 13:39:13.981746 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:13 crc kubenswrapper[4936]: E0930 13:39:13.981815 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:14 crc kubenswrapper[4936]: W0930 13:39:14.199353 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:14 crc kubenswrapper[4936]: E0930 13:39:14.199418 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.64:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.226750 4936 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.64:6443: connect: connection refused Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.400908 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6"} Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.401068 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.402184 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.402223 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.402235 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404157 4936 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fee4d40a61fe8d2c1c53e3c99cb0fe8902aecf0c55a643aed7b9cde2b4678948" exitCode=0 Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404259 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404264 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fee4d40a61fe8d2c1c53e3c99cb0fe8902aecf0c55a643aed7b9cde2b4678948"} Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404321 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404348 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404268 4936 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.404525 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.405592 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.405619 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.405629 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406057 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406068 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406422 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406440 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406447 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406800 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:14 crc 
kubenswrapper[4936]: I0930 13:39:14.406818 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.406825 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:14 crc kubenswrapper[4936]: I0930 13:39:14.484944 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.412912 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.412952 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"34790420f2fbc84f8e127a0d53172ba0b739fe72c02abc09b18cdb3780469ee7"} Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.412995 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0c8e08c20275c5eb8fb4bd360ab1115014050fe2bd9b8dfaac407f8ab9aa115c"} Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413016 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"44ff5042a7d61e0e4577b29599f839c2daf706edebfb291251517082ad13413a"} Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413032 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f9a4b004a24ae6839d938f2bd246598c885fc558bfba73f45e9788aedd6a348"} Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413047 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5b646312f80abd4a3f21dcd0ae1484aad140291ffc1c18f7e22a8e5ac0e013a"} Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413127 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413178 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.413402 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414028 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414053 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414064 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414136 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.414174 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.415557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.415594 4936 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:15 crc kubenswrapper[4936]: I0930 13:39:15.415611 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.414830 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.414898 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.416437 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.416468 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.416477 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.417166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.417189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.417198 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.687123 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.688577 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 
13:39:16.688612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.688624 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:16 crc kubenswrapper[4936]: I0930 13:39:16.688647 4936 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.130526 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.133131 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.133382 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.134887 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.134929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.134946 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.199918 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.418408 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.418460 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 
13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420247 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420321 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420651 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420731 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.420770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.485642 4936 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 13:39:17 crc kubenswrapper[4936]: I0930 13:39:17.485740 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.238888 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.239082 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.240360 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.240428 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.240437 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.631558 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.631748 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.633282 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.633397 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.633426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:19 crc kubenswrapper[4936]: I0930 13:39:19.672025 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.365581 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.365761 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.366924 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.366960 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.366976 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:20 crc kubenswrapper[4936]: E0930 13:39:20.380790 4936 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.424266 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.425561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.425621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:20 crc kubenswrapper[4936]: I0930 13:39:20.425637 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.435729 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.438042 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6" exitCode=255 Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.438093 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6"} Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.438254 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.439043 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.439102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.439111 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.439613 4936 scope.go:117] "RemoveContainer" containerID="ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.683528 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.683691 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.684966 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.685011 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 
13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.685025 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.726385 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 13:39:24 crc kubenswrapper[4936]: W0930 13:39:24.758553 4936 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.758653 4936 trace.go:236] Trace[1322429555]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:39:14.757) (total time: 10001ms): Sep 30 13:39:24 crc kubenswrapper[4936]: Trace[1322429555]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:39:24.758) Sep 30 13:39:24 crc kubenswrapper[4936]: Trace[1322429555]: [10.001397105s] [10.001397105s] END Sep 30 13:39:24 crc kubenswrapper[4936]: E0930 13:39:24.758682 4936 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.927584 4936 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.927639 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.934069 4936 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:39:24 crc kubenswrapper[4936]: I0930 13:39:24.934106 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.441702 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.443588 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed"} Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.443676 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.443748 4936 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444483 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444509 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444521 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444479 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444551 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.444565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:25 crc kubenswrapper[4936]: I0930 13:39:25.458100 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 13:39:26 crc kubenswrapper[4936]: I0930 13:39:26.445262 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:26 crc kubenswrapper[4936]: I0930 13:39:26.446212 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:26 crc kubenswrapper[4936]: I0930 13:39:26.446254 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:26 crc kubenswrapper[4936]: I0930 13:39:26.446266 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.133484 4936 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.133609 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.133879 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.134433 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.134462 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.134474 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.136476 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.136650 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.137567 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.137593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.137604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.138382 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.447291 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.448378 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.448425 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.448437 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.486492 4936 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 13:39:27 crc kubenswrapper[4936]: I0930 13:39:27.486563 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 13:39:28 crc kubenswrapper[4936]: I0930 13:39:28.449294 4936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:39:28 crc kubenswrapper[4936]: I0930 13:39:28.450122 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:28 crc kubenswrapper[4936]: I0930 
13:39:28.450152 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:28 crc kubenswrapper[4936]: I0930 13:39:28.450161 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.921183 4936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.922788 4936 trace.go:236] Trace[706675626]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:39:18.873) (total time: 11049ms): Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[706675626]: ---"Objects listed" error: 11049ms (13:39:29.922) Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[706675626]: [11.04913153s] [11.04913153s] END Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.922827 4936 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.923645 4936 trace.go:236] Trace[1897150]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:39:17.523) (total time: 12399ms): Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[1897150]: ---"Objects listed" error: 12399ms (13:39:29.923) Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[1897150]: [12.399875235s] [12.399875235s] END Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.923670 4936 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.930469 4936 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.931632 4936 trace.go:236] 
Trace[990207933]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:39:19.360) (total time: 10570ms): Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[990207933]: ---"Objects listed" error: 10570ms (13:39:29.931) Sep 30 13:39:29 crc kubenswrapper[4936]: Trace[990207933]: [10.57064235s] [10.57064235s] END Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.931660 4936 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.935373 4936 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.935457 4936 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.935475 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.940015 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.940216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.940343 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.940442 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.940528 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:29Z","lastTransitionTime":"2025-09-30T13:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.950788 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.954483 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.954701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.954783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.954928 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.955017 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:29Z","lastTransitionTime":"2025-09-30T13:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.965215 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.968270 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.968441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.968527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.968610 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.968690 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:29Z","lastTransitionTime":"2025-09-30T13:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.978714 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.982977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.983010 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.983019 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.983035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:29 crc kubenswrapper[4936]: I0930 13:39:29.983045 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:29Z","lastTransitionTime":"2025-09-30T13:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.994928 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.995045 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:39:29 crc kubenswrapper[4936]: E0930 13:39:29.995072 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:39:30 crc kubenswrapper[4936]: E0930 13:39:30.095540 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:39:30 crc kubenswrapper[4936]: E0930 13:39:30.196398 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:39:30 crc kubenswrapper[4936]: E0930 13:39:30.296837 4936 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.352362 4936 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.398967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.399007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.399018 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.399050 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.399062 4936 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.454907 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.455419 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.457106 4936 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed" exitCode=255 Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.457143 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed"} Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.457190 4936 scope.go:117] "RemoveContainer" containerID="ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.500773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.500825 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.500837 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.500853 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.500864 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.595796 4936 scope.go:117] "RemoveContainer" containerID="32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed" Sep 30 13:39:30 crc kubenswrapper[4936]: E0930 13:39:30.596025 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.602996 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.603040 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.603053 4936 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.603069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.603080 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.706057 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.706393 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.706541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.706642 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.706733 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.808639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.808663 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.808672 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.808684 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.808692 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.911441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.911471 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.911481 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.911497 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:30 crc kubenswrapper[4936]: I0930 13:39:30.911507 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:30Z","lastTransitionTime":"2025-09-30T13:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.013745 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.013809 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.013819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.013831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.013845 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.116300 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.116356 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.116368 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.116382 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.116393 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.218284 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.218319 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.218377 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.218392 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.218401 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.224437 4936 apiserver.go:52] "Watching apiserver"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.228169 4936 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.228537 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-7vnws","openshift-machine-config-operator/machine-config-daemon-wj4sz","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-additional-cni-plugins-jzqxn","openshift-multus/multus-vxjrh","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-5zj44"]
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.228947 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.228960 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.229521 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.229522 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.229709 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.229504 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.229545 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.230052 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.230131 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5zj44"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.230682 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.232457 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.233922 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.234724 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.234869 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.234958 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vxjrh"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.235373 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jzqxn"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.234958 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.236373 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.236544 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.236688 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.236799 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.236720 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237480 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237670 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237705 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237729 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237766 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237789 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237809 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237834 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237860 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237882 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237908 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237931 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237952 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237974 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.237998 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h77cz\" (UniqueName: \"kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238025 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238049 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238073 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238095 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238120 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-hosts-file\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238144 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238168 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238193 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238263 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238288 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238323 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238369 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238394 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238415 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238436 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238457 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238463 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238477 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238500 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnlqh\" (UniqueName: \"kubernetes.io/projected/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-kube-api-access-fnlqh\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238533 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.238867 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.238922 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:31.738904953 +0000 UTC m=+22.122907264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238956 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.238983 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.239012 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.239553 4936 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.239822 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.239886 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.239946 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:31.73991398 +0000 UTC m=+22.123916281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240430 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240555 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240699 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240803 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240924 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.240974 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241412 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241418 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241437 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241471 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241524 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241910 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.241915 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242040 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242158 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242402 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242532 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242439 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242470 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242759 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.242856 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.245169 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.246667 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.247166 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.258628 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.258656 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.258666 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.258718 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:31.758702485 +0000 UTC m=+22.142704786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.260549 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.260579 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.260591 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.260641 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}"
failed. No retries permitted until 2025-09-30 13:39:31.760625238 +0000 UTC m=+22.144627539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.260924 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.263723 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.263876 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.266245 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.271960 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.277862 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.286954 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.287215 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:39:31 crc kubenswrapper[4936]: W0930 13:39:31.299125 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6e5adf0fb05711f1dbb992ecce2b62bc388158130286f73f5ce5aa10f32e2345 WatchSource:0}: Error finding container 6e5adf0fb05711f1dbb992ecce2b62bc388158130286f73f5ce5aa10f32e2345: Status 404 returned error can't find the container with id 6e5adf0fb05711f1dbb992ecce2b62bc388158130286f73f5ce5aa10f32e2345 Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.299248 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.313063 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.321117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.321361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.321453 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.321535 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.321618 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.330132 4936 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.335124 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339841 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339885 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339908 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339927 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339944 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339961 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.339983 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340004 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340024 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340047 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340077 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340097 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340116 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340135 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340155 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340175 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340195 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340218 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340240 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340259 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340280 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340320 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340366 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340389 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340412 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340433 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340539 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340567 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340592 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340625 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340647 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340669 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340698 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340719 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340740 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340759 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340780 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340798 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340817 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340836 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340858 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340875 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340893 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340912 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340932 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340953 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340971 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.340992 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341014 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341034 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341053 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341071 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341089 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341107 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341316 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341370 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341389 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341409 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341426 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341445 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341559 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341583 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341607 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341631 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341658 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341681 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341705 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341733 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.341757 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341777 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341798 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341821 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341843 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341863 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341884 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341904 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341925 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341951 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341973 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.341995 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342018 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342039 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342060 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342080 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342102 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342122 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342217 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342241 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342262 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342284 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342306 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342327 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342372 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342396 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342418 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342437 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:39:31 
crc kubenswrapper[4936]: I0930 13:39:31.342459 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342487 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342507 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342531 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342555 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342576 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342596 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342616 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342659 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342683 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342705 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 
13:39:31.342727 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342748 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342767 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342787 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342807 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342829 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342850 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342871 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342893 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342914 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342936 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342959 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.342979 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343004 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343028 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343049 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343070 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343092 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343114 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343135 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343159 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343182 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343203 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343226 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343254 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343276 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343298 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343320 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.343756 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.344130 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.344302 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.344471 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.344812 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.345102 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.345252 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.345276 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.345708 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.345732 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346017 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346029 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346128 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346256 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346480 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346675 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.346434 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.347041 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.347082 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.347097 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.347399 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.347962 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.348015 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.348207 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.348523 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.348742 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.348960 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.349052 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.349625 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.349840 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350062 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350079 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350086 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350433 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350628 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350732 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.350759 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351104 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351126 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351217 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351324 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351513 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351533 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.351911 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.352122 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.352384 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.352432 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.352781 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353155 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353157 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353363 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353483 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353751 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.353973 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.354443 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.354518 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.354855 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355117 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355133 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355496 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355329 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355781 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.355976 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.356102 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.356438 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:31.856415574 +0000 UTC m=+22.240417965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357524 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357561 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357580 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357601 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357619 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357638 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357655 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357673 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357694 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357712 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357728 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357744 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357759 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357774 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357790 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357806 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357822 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357837 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357852 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357868 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357882 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357897 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357915 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357938 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357960 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357981 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358002 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358023 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358042 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358061 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358077 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358094 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358112 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358128 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358145 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358371 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358387 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358405 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358421 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358439 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358456 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358473 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358489 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358505 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358520 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358535 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358550 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358565 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358583 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358598 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358616 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358631 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358648 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358664 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358680 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358698 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358712 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358760 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358783 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-system-cni-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358801 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358817 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-426nz\" (UniqueName: \"kubernetes.io/projected/e09d215c-5c94-4b2a-bc68-c51a84b784a7-kube-api-access-426nz\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358839 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358877 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcsg\" (UniqueName: \"kubernetes.io/projected/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-kube-api-access-2fcsg\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358894 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-k8s-cni-cncf-io\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " 
pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358912 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-multus-certs\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358931 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnlqh\" (UniqueName: \"kubernetes.io/projected/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-kube-api-access-fnlqh\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359095 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-socket-dir-parent\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359116 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-conf-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359135 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359151 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-system-cni-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359166 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-os-release\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359184 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359200 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359217 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: 
I0930 13:39:31.359232 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-cni-binary-copy\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359247 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-multus\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359284 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359300 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-kubelet\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359315 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e09d215c-5c94-4b2a-bc68-c51a84b784a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.359348 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359364 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359379 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-bin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359393 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-multus-daemon-config\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359414 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.359430 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e09d215c-5c94-4b2a-bc68-c51a84b784a7-proxy-tls\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359458 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h77cz\" (UniqueName: \"kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359476 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359492 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-netns\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359510 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 
13:39:31.359527 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-hosts-file\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359544 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359561 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359589 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359603 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-cnibin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359669 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359686 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359701 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-cni-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359718 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwk7d\" (UniqueName: \"kubernetes.io/projected/9dbb1e3f-927e-4587-835e-b21370b33262-kube-api-access-rwk7d\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359736 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359752 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359768 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359795 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359812 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359830 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cnibin\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359845 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-hostroot\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359860 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-etc-kubernetes\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359887 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359902 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-os-release\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359917 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e09d215c-5c94-4b2a-bc68-c51a84b784a7-rootfs\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360000 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360012 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360023 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360034 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360045 4936 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360057 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360068 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360080 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath 
\"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360091 4936 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360102 4936 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360113 4936 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360124 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360136 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360148 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360159 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360171 4936 reconciler_common.go:293] 
"Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360205 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360215 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360225 4936 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360235 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360245 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360255 4936 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360265 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360275 4936 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360285 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360296 4936 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360305 4936 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360314 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360323 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360350 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 
13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360388 4936 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360398 4936 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360407 4936 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360416 4936 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360426 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360435 4936 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360445 4936 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360455 4936 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360465 4936 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360475 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360485 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360494 4936 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360504 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360514 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360524 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360533 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360543 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360552 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360561 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360570 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360580 4936 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360589 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360598 4936 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360607 4936 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360617 4936 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360626 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360635 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360646 4936 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360655 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.360665 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360674 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360683 4936 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.369630 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372629 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.374365 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units\") pod 
\"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.375607 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.376996 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377142 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377198 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-hosts-file\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377232 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.377263 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377421 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377485 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377513 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377539 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377757 4936 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.378805 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.379004 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.357930 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: 
"6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358101 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358113 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358572 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.379922 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.358846 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359776 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.359807 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360072 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360149 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360227 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360301 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360422 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360442 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360630 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360843 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.360855 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.361044 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.361289 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.362374 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.363395 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.369861 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.380109 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.370384 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.370473 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.370657 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.370849 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371052 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371133 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371242 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371347 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371470 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371550 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371768 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.371870 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372058 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372061 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372290 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372327 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372387 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372574 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372631 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.373168 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.373360 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.373529 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.373732 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.373765 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.374109 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.374434 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.374457 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.375158 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.375205 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.372925 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.375510 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.375517 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.376093 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.376517 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.376642 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377290 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.377856 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.378112 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.378578 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.379785 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.380244 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.380658 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.380853 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381108 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381136 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381221 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381305 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
(OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381382 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381380 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381670 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381783 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381811 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.381980 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.382205 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.383626 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.383687 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386080 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386175 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.388425 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 
crc kubenswrapper[4936]: I0930 13:39:31.389064 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.389131 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.382440 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.382700 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.383056 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.383553 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.383577 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.384043 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.384268 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.384283 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.384551 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.384775 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385008 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385031 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385029 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385202 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385796 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.385941 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386358 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386380 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386485 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386567 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386666 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386762 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.386986 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387059 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387074 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387426 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387483 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387617 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387714 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387880 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387957 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.387884 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.388130 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.388429 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.388795 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.388881 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.389006 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.389137 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.389714 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.390392 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.391647 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.392021 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.392308 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.392800 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.392822 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.392958 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393007 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393055 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393241 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393268 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393490 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393804 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393927 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.393985 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.394905 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.395763 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnlqh\" (UniqueName: \"kubernetes.io/projected/aa0af778-18be-4c3d-aa1a-0e15485c2aa2-kube-api-access-fnlqh\") pod \"node-resolver-5zj44\" (UID: \"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\") " pod="openshift-dns/node-resolver-5zj44" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.397664 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h77cz\" (UniqueName: \"kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz\") pod \"ovnkube-node-7vnws\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.398216 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.400316 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.411560 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead175d706f67b3b024c6205f7558205ed4b209953fac12acb50b921ee784fc6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:24Z\\\",\\\"message\\\":\\\"W0930 13:39:13.814393 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 13:39:13.814819 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759239553 cert, and key in /tmp/serving-cert-2495725929/serving-signer.crt, /tmp/serving-cert-2495725929/serving-signer.key\\\\nI0930 13:39:14.053045 1 observer_polling.go:159] Starting file observer\\\\nW0930 13:39:14.056982 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 13:39:14.057255 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:14.057963 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2495725929/tls.crt::/tmp/serving-cert-2495725929/tls.key\\\\\\\"\\\\nF0930 13:39:24.343551 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.415677 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.419655 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.420385 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.426543 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.426578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.426587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.426600 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.426612 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.428559 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.444395 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.452101 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461076 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461247 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-netns\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461287 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-cnibin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461350 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461383 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-cni-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461397 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-cnibin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461400 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwk7d\" (UniqueName: \"kubernetes.io/projected/9dbb1e3f-927e-4587-835e-b21370b33262-kube-api-access-rwk7d\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461449 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cnibin\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461466 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-hostroot\") pod \"multus-vxjrh\" (UID: 
\"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461480 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-etc-kubernetes\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461518 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-os-release\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461533 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e09d215c-5c94-4b2a-bc68-c51a84b784a7-rootfs\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461551 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-system-cni-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461566 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: 
\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461583 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-426nz\" (UniqueName: \"kubernetes.io/projected/e09d215c-5c94-4b2a-bc68-c51a84b784a7-kube-api-access-426nz\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461609 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-k8s-cni-cncf-io\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461629 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcsg\" (UniqueName: \"kubernetes.io/projected/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-kube-api-access-2fcsg\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461643 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-multus-certs\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461659 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-system-cni-dir\") pod \"multus-vxjrh\" (UID: 
\"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461675 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-os-release\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-socket-dir-parent\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461703 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-conf-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461717 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-multus\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461754 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-cni-binary-copy\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461769 
4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-kubelet\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461783 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e09d215c-5c94-4b2a-bc68-c51a84b784a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461801 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461817 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-bin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461831 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-multus-daemon-config\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461859 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e09d215c-5c94-4b2a-bc68-c51a84b784a7-proxy-tls\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461925 4936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461935 4936 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461947 4936 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461956 4936 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461965 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461974 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.461984 4936 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461993 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462002 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462010 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462019 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462027 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462035 4936 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462043 4936 reconciler_common.go:293] "Volume detached for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462051 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462060 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462068 4936 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462077 4936 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462085 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462094 4936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462122 4936 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462131 4936 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462139 4936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462147 4936 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462155 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462164 4936 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462172 4936 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462181 4936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462178 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462216 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cnibin\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462248 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-hostroot\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462267 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-etc-kubernetes\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462306 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-os-release\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " 
pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462325 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e09d215c-5c94-4b2a-bc68-c51a84b784a7-rootfs\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462360 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-system-cni-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462829 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462933 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-multus\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463193 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-k8s-cni-cncf-io\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " 
pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.461352 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-netns\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463376 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-run-multus-certs\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463416 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-system-cni-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463455 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-os-release\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463496 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-socket-dir-parent\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463536 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-conf-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463697 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-multus-cni-dir\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.462189 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463721 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-cni-binary-copy\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463734 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463748 4936 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463751 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-kubelet\") pod \"multus-vxjrh\" (UID: 
\"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463761 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463774 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463786 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463798 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463810 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463822 4936 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463833 4936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.463845 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463857 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463869 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463880 4936 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463891 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463903 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463915 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463929 4936 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463941 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463954 4936 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463966 4936 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463978 4936 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.463990 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464001 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464013 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464026 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464037 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464050 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464061 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464073 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464085 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464099 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464110 4936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464122 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464132 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464144 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464155 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464158 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e09d215c-5c94-4b2a-bc68-c51a84b784a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464166 4936 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464175 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dbb1e3f-927e-4587-835e-b21370b33262-host-var-lib-cni-bin\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464182 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464453 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464463 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464472 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464481 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc 
kubenswrapper[4936]: I0930 13:39:31.464490 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464498 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464506 4936 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464514 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464522 4936 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464530 4936 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464538 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464546 4936 reconciler_common.go:293] "Volume detached 
for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464553 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464562 4936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464570 4936 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464578 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464586 4936 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464596 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464604 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464612 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464620 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464628 4936 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464636 4936 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464645 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464654 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464662 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464671 4936 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464679 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464687 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464685 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dbb1e3f-927e-4587-835e-b21370b33262-multus-daemon-config\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464697 4936 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464706 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464715 4936 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464723 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464733 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464741 4936 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464749 4936 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464759 4936 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464768 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464776 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464784 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464791 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464800 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464808 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464816 4936 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464824 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464833 4936 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 
13:39:31.464840 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464848 4936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464856 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464864 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464871 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464879 4936 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464887 4936 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464888 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464895 4936 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464923 4936 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464934 4936 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464946 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464957 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464967 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464978 4936 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.464989 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465003 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465016 4936 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465029 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465041 4936 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465364 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e09d215c-5c94-4b2a-bc68-c51a84b784a7-proxy-tls\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.465975 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.476104 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.479920 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-426nz\" (UniqueName: \"kubernetes.io/projected/e09d215c-5c94-4b2a-bc68-c51a84b784a7-kube-api-access-426nz\") pod \"machine-config-daemon-wj4sz\" (UID: \"e09d215c-5c94-4b2a-bc68-c51a84b784a7\") " pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.480318 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwk7d\" (UniqueName: \"kubernetes.io/projected/9dbb1e3f-927e-4587-835e-b21370b33262-kube-api-access-rwk7d\") pod \"multus-vxjrh\" (UID: \"9dbb1e3f-927e-4587-835e-b21370b33262\") " pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.480417 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6e5adf0fb05711f1dbb992ecce2b62bc388158130286f73f5ce5aa10f32e2345"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.480494 4936 scope.go:117] "RemoveContainer" containerID="32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed" Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.480675 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.482608 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcsg\" (UniqueName: \"kubernetes.io/projected/6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf-kube-api-access-2fcsg\") pod \"multus-additional-cni-plugins-jzqxn\" (UID: \"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\") " pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.490674 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.500912 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.510899 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.521804 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.528840 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.528889 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.528903 4936 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.528917 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.528925 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.533857 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.544298 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.554467 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.556580 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.563522 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.581694 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.593836 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5zj44" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.597037 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.610620 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.627689 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vxjrh" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.631522 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.631550 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.631557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.631575 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.631584 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.641316 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.643991 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.659523 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.661026 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.667373 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.675673 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:39:31 crc kubenswrapper[4936]: W0930 13:39:31.692702 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd1cdd6_e1f0_4750_849e_8e22bbc7fedf.slice/crio-55d9f1053782f76c2a8aa1930fd515b181309acd7ec6bf1bba236a5d14f748d7 WatchSource:0}: Error finding container 55d9f1053782f76c2a8aa1930fd515b181309acd7ec6bf1bba236a5d14f748d7: Status 404 returned error can't find the container with id 
55d9f1053782f76c2a8aa1930fd515b181309acd7ec6bf1bba236a5d14f748d7 Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.734433 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.734464 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.734473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.734487 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.734496 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.766923 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.766955 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.766987 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.767003 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767101 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767116 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767126 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767166 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:32.767153482 +0000 UTC m=+23.151155783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767399 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767469 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:39:32.76745242 +0000 UTC m=+23.151454711 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767509 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767521 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767529 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767552 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:32.767545552 +0000 UTC m=+23.151547853 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767584 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.767608 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:32.767600304 +0000 UTC m=+23.151602605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.838564 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.838598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.838605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.838619 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.838629 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.867299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:31 crc kubenswrapper[4936]: E0930 13:39:31.867456 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:32.86743475 +0000 UTC m=+23.251437051 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.940913 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.940943 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.940954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:31 crc kubenswrapper[4936]: I0930 13:39:31.940970 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:31 crc kubenswrapper[4936]: 
I0930 13:39:31.940981 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:31Z","lastTransitionTime":"2025-09-30T13:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.042578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.042612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.042623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.042639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.042649 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.146919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.146968 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.146978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.146992 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.147002 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.249437 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.249476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.249487 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.249502 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.249512 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.322723 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.324146 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.326474 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.328060 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.329388 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.330553 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.331960 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.333171 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.334689 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.335944 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.337892 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.340155 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.342268 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.343071 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.343643 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.344266 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.344884 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.345410 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.346026 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.347748 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.348946 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.350463 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352122 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352172 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352188 4936 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352212 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352231 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352256 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.352913 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.353381 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.354018 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.355843 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.356918 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.358817 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.359414 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.359881 4936 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.359999 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.362014 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.362576 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.363416 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.364885 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.365530 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.366444 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.367096 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.368103 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.368602 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.369585 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.370279 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.371214 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.371712 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.372659 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.373207 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.374466 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.375160 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.376748 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.377891 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.378711 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.380981 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.382185 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.454725 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.454782 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.454801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.454823 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.454840 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.484541 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5fccaeb5ff76dcea741597c7814153e1c08410009b0396c9b8940f2ae1e7c163"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.486920 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.487280 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cd4ad06f0a3223f97f996a18d21b083b373cafffe54f6d7ccfde15b1d2600e38"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.489503 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" exitCode=0 Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.489630 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.490030 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"493dfd956fcb83a3139b4a2529fcc5a83c028b75cd93df862ba9bcfaaea7bfc0"} Sep 30 13:39:32 crc kubenswrapper[4936]: 
I0930 13:39:32.492481 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5zj44" event={"ID":"aa0af778-18be-4c3d-aa1a-0e15485c2aa2","Type":"ContainerStarted","Data":"cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.492524 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5zj44" event={"ID":"aa0af778-18be-4c3d-aa1a-0e15485c2aa2","Type":"ContainerStarted","Data":"43bde691ecc804b23f891f0c08d4db8cff198b41f0ea31a4c3f70d708eb8f996"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.494406 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.494442 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.496738 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70" exitCode=0 Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.496794 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.497011 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerStarted","Data":"55d9f1053782f76c2a8aa1930fd515b181309acd7ec6bf1bba236a5d14f748d7"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.502071 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerStarted","Data":"0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.502134 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerStarted","Data":"06af4f55fbafe2502b7d7c5c68bb90bf36e9e19b88bd1c95e6e86c6bcaf4a875"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.504734 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.504786 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.504803 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"a9d905a43e301d66983932f144a92d421c35c5931afac3ed5ba0e5f58fd80d3a"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.505171 4936 scope.go:117] "RemoveContainer" 
containerID="32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed" Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.505325 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.518385 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.542890 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.557406 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.560532 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.560572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.560583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.560600 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.560611 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.587854 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.600595 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.613415 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.626042 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.637030 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.648452 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.659368 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.663813 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.663850 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.663860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.663876 4936 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.663884 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.672287 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.690889 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.705852 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.720877 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.734077 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.745488 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.764042 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.765656 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.765678 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.765688 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.765703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.765712 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.776468 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.776499 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.776533 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.776549 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776583 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776670 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:34.77665311 +0000 UTC m=+25.160655411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776670 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776689 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776700 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776670 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776772 4936 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776783 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776790 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776735 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:34.776721162 +0000 UTC m=+25.160723463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776851 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:34.776840775 +0000 UTC m=+25.160843076 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.776864 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:34.776858496 +0000 UTC m=+25.160860787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.779524 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0
1375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.794003 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.805014 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.818255 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.829879 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.842323 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.854237 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.867723 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.867757 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: 
I0930 13:39:32.867766 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.867779 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.867790 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.877263 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:32 crc kubenswrapper[4936]: E0930 13:39:32.877535 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:34.877507445 +0000 UTC m=+25.261509746 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.920797 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fx6ff"] Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.921161 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.925165 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.925260 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.925904 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.925979 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.938802 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.955066 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.973178 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.973217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.973227 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.973242 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.973263 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:32Z","lastTransitionTime":"2025-09-30T13:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.976800 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.978138 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be826921-6363-4c9a-9167-3af8e59e042d-host\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.978302 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzfh\" (UniqueName: 
\"kubernetes.io/projected/be826921-6363-4c9a-9167-3af8e59e042d-kube-api-access-4wzfh\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.978430 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/be826921-6363-4c9a-9167-3af8e59e042d-serviceca\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:32 crc kubenswrapper[4936]: I0930 13:39:32.997869 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.009843 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.023120 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.035522 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.047154 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.059743 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.073404 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.075860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.075893 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.075902 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.075914 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.075923 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.079827 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzfh\" (UniqueName: \"kubernetes.io/projected/be826921-6363-4c9a-9167-3af8e59e042d-kube-api-access-4wzfh\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.079866 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/be826921-6363-4c9a-9167-3af8e59e042d-serviceca\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.079914 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be826921-6363-4c9a-9167-3af8e59e042d-host\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.079969 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be826921-6363-4c9a-9167-3af8e59e042d-host\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.080772 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/be826921-6363-4c9a-9167-3af8e59e042d-serviceca\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.089587 4936 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.100294 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzfh\" (UniqueName: \"kubernetes.io/projected/be826921-6363-4c9a-9167-3af8e59e042d-kube-api-access-4wzfh\") pod \"node-ca-fx6ff\" (UID: \"be826921-6363-4c9a-9167-3af8e59e042d\") " pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.104176 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.119761 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.178633 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.178677 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.178689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.178707 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.178718 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.232125 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fx6ff" Sep 30 13:39:33 crc kubenswrapper[4936]: W0930 13:39:33.252058 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe826921_6363_4c9a_9167_3af8e59e042d.slice/crio-f6acd8142974adac646e153483a1754bbdd60e1468da6b9ac454ef7990f8393c WatchSource:0}: Error finding container f6acd8142974adac646e153483a1754bbdd60e1468da6b9ac454ef7990f8393c: Status 404 returned error can't find the container with id f6acd8142974adac646e153483a1754bbdd60e1468da6b9ac454ef7990f8393c Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.283034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.283073 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.283084 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.283103 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.283116 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.314502 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:33 crc kubenswrapper[4936]: E0930 13:39:33.314659 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.314951 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:33 crc kubenswrapper[4936]: E0930 13:39:33.315000 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.315032 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:33 crc kubenswrapper[4936]: E0930 13:39:33.315068 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.385216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.385506 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.385526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.385541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.385549 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.487021 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.487048 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.487056 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.487071 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.487081 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.515722 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7" exitCode=0 Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.515786 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.520322 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.520380 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.520395 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.520408 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.524013 4936 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/node-ca-fx6ff" event={"ID":"be826921-6363-4c9a-9167-3af8e59e042d","Type":"ContainerStarted","Data":"f6acd8142974adac646e153483a1754bbdd60e1468da6b9ac454ef7990f8393c"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.530068 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.544382 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.574196 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.587032 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.590521 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.590553 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.590563 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 
13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.590578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.590588 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.603621 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.618801 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.632271 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.648875 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.667742 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.682024 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 
13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.696740 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.696796 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.696805 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.696819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc 
kubenswrapper[4936]: I0930 13:39:33.696827 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.698833 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.712585 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.732020 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.802517 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.802541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.802549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.802562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.802570 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.904496 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.904524 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.904533 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.904547 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:33 crc kubenswrapper[4936]: I0930 13:39:33.904557 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:33Z","lastTransitionTime":"2025-09-30T13:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.007239 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.007267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.007280 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.007297 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.007309 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.109703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.109748 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.109759 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.109775 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.109785 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.211656 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.211699 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.211709 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.211724 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.211736 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.313740 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.313772 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.313783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.313797 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.313807 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.415799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.415832 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.415840 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.415854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.415864 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.490000 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.494130 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.498809 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.507106 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.517947 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.517978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.517988 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.518004 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.518015 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.519683 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\"
,\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.527583 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.528667 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fx6ff" event={"ID":"be826921-6363-4c9a-9167-3af8e59e042d","Type":"ContainerStarted","Data":"142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.530356 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d" exitCode=0 Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.530414 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.533798 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.533902 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.544289 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.561709 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.577815 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.589486 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.600044 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.609539 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.618639 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.619842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.619864 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.619871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.620013 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.620041 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.628639 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.637572 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.651925 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.666942 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.679090 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 
13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.691416 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.701892 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.717542 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.721842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.721873 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.721883 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.721899 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.721911 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.730498 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.743170 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.757202 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.767560 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.776790 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.787011 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.795960 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.796004 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.796046 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.796095 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796221 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796239 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796251 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796299 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:38.796282066 +0000 UTC m=+29.180284367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796425 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796477 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:38.796466991 +0000 UTC m=+29.180469292 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796530 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796546 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796555 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796560 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796583 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:38.796574584 +0000 UTC m=+29.180576885 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.796660 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:38.796644096 +0000 UTC m=+29.180646397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.799819 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-c
opy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.809562 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.817557 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.823946 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.823979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.823998 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.824015 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.824026 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.828466 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:34Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.896969 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:34 crc kubenswrapper[4936]: E0930 13:39:34.897270 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:38.897250264 +0000 UTC m=+29.281252565 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.926555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.926588 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.926598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 13:39:34.926611 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:34 crc kubenswrapper[4936]: I0930 
13:39:34.926619 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:34Z","lastTransitionTime":"2025-09-30T13:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.029376 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.029846 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.029914 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.029986 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.030068 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.132195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.132248 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.132263 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.132319 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.132355 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.234773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.234818 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.234829 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.234842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.234851 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.314899 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.314910 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:39:35 crc kubenswrapper[4936]: E0930 13:39:35.315103 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.314935 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:39:35 crc kubenswrapper[4936]: E0930 13:39:35.315207 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:39:35 crc kubenswrapper[4936]: E0930 13:39:35.315285 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.336727 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.336752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.336760 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.336775 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.336783 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.439359 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.439400 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.439409 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.439427 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.439437 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.538677 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe" exitCode=0
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.538755 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.541548 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.541594 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.541609 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.541628 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.541642 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.564150 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.582021 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.593750 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.605880 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.619772 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z"
Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.630998 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.642894 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.644997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.645036 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.645045 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc 
kubenswrapper[4936]: I0930 13:39:35.645061 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.645070 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.660620 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.670045 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a
05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.680514 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.690872 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.702092 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.713198 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.728305 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:35Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.748077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.748135 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.748146 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.748161 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.748192 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.850262 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.850287 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.850295 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.850307 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.850315 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.952713 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.952761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.952774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.952792 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:35 crc kubenswrapper[4936]: I0930 13:39:35.952806 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:35Z","lastTransitionTime":"2025-09-30T13:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.054877 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.054919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.054929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.054947 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.054974 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.157410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.157469 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.157480 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.157494 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.157503 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.260048 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.260076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.260089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.260102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.260111 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.362379 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.362416 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.362426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.362441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.362451 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.464988 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.465040 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.465055 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.465074 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.465086 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.544292 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b" exitCode=0 Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.544406 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.548633 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.566248 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.567079 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.567134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.567145 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.567160 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.567188 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.582995 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.594145 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.611944 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.623585 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.635620 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.648216 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.658573 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.669417 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.669460 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.669471 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.669485 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.669495 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.670843 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.683303 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.695124 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.705450 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.717312 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.732842 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:36Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.771526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.771554 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.771562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.771575 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.771583 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.873731 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.873761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.873768 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.873781 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.873790 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.976406 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.976435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.976443 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.976456 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:36 crc kubenswrapper[4936]: I0930 13:39:36.976466 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:36Z","lastTransitionTime":"2025-09-30T13:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.079021 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.079246 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.079395 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.079473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.079532 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.182150 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.182195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.182207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.182227 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.182241 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.285181 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.285516 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.285651 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.285759 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.285880 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.314532 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.314542 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:37 crc kubenswrapper[4936]: E0930 13:39:37.314745 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:37 crc kubenswrapper[4936]: E0930 13:39:37.314670 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.314562 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:37 crc kubenswrapper[4936]: E0930 13:39:37.315001 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.389058 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.389350 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.389630 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.389770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.389882 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.493729 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.493769 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.493791 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.493809 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.493820 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.555854 4936 generic.go:334] "Generic (PLEG): container finished" podID="6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf" containerID="0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00" exitCode=0 Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.555894 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerDied","Data":"0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.577894 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.591304 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.595463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.595489 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.595498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.595510 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.595520 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.604237 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.622266 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.639093 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.657460 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.671616 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.684839 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.697823 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.697864 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.697875 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.697893 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.697904 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.698140 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.711775 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.728285 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.740508 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.753025 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.770055 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.801009 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.801043 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.801051 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.801064 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.801073 4936 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.903231 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.903259 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.903267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.903278 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:37 crc kubenswrapper[4936]: I0930 13:39:37.903288 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:37Z","lastTransitionTime":"2025-09-30T13:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.005306 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.005365 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.005373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.005386 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.005395 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.108302 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.108410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.108435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.108460 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.108478 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.210655 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.211010 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.211024 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.211044 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.211058 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.313249 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.313283 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.313411 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.313437 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.313445 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.416134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.416179 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.416196 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.416220 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.416237 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.519099 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.519133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.519142 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.519155 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.519164 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.562525 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" event={"ID":"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf","Type":"ContainerStarted","Data":"63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.566328 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.566689 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.566734 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.576196 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0
1375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.589065 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.592004 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.596390 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.599281 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.613862 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.621716 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.622038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.622127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.622228 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.622326 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.627514 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.642655 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.653877 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.663087 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.675094 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.688915 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.702072 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.711406 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.723847 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.725438 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.725495 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.725507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.725524 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.725541 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.741975 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.756655 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.770668 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.781410 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.799875 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.811171 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.822866 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.827577 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.827619 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.827627 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.827643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.827653 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.835469 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.835503 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.835531 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.835551 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835653 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835666 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835697 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:46.835684201 +0000 UTC m=+37.219686493 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835696 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835715 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835765 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:46.835748313 +0000 UTC m=+37.219750614 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835782 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835820 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835832 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835843 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835822 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:46.835811655 +0000 UTC m=+37.219813956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.835880 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:46.835872967 +0000 UTC m=+37.219875268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.836072 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.846973 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.859719 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.880122 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.893008 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.904429 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.920902 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.929761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.929789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.929798 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.929811 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.929821 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:38Z","lastTransitionTime":"2025-09-30T13:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.936086 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:38 crc kubenswrapper[4936]: E0930 13:39:38.936223 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:39:46.936206857 +0000 UTC m=+37.320209158 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:38 crc kubenswrapper[4936]: I0930 13:39:38.937088 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.034089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.034148 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.034162 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.034178 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.034187 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.139074 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.139153 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.139173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.139565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.139605 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.242205 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.242291 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.242303 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.242318 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.242355 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.314783 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.314884 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:39 crc kubenswrapper[4936]: E0930 13:39:39.314912 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:39 crc kubenswrapper[4936]: E0930 13:39:39.315009 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.314808 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:39 crc kubenswrapper[4936]: E0930 13:39:39.315080 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.345055 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.345117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.345129 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.345143 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.345152 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.448178 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.448235 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.448246 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.448261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.448271 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.550932 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.550970 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.550985 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.551003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.551016 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.569997 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.654912 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.654993 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.655011 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.655040 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.655059 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.758942 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.759005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.759022 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.759046 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.759067 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.862952 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.863017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.863041 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.863076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.863100 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.965859 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.965957 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.965979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.966003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:39 crc kubenswrapper[4936]: I0930 13:39:39.966022 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:39Z","lastTransitionTime":"2025-09-30T13:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.068743 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.068785 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.068797 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.068848 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.068876 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.137373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.137410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.137418 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.137430 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.137439 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: E0930 13:39:40.155474 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.159621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.159649 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.159660 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.159677 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.159688 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.175482 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.175516 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.175528 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.175544 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.175556 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: E0930 13:39:40.189867 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.193424 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.193465 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.193477 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.193495 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.193505 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: E0930 13:39:40.206036 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.209828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.209866 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.209880 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.209900 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.209917 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: E0930 13:39:40.223127 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: E0930 13:39:40.223287 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.224883 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.224958 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.224971 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.224986 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.224998 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333306 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333846 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333869 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.333884 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.359757 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.381041 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.400824 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.434378 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.438665 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.438701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.438711 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.438726 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.438736 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.453068 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.472673 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.494166 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.511663 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.540802 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.540844 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.540860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.540877 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.540889 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.547585 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.565702 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.572112 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.580309 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2a
b8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.591572 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.608220 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.642878 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.642909 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.642919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.642933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.642943 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.745063 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.745134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.745145 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.745165 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.745178 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.847498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.847547 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.847559 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.847579 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.847592 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.950854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.950895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.950903 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.950919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:40 crc kubenswrapper[4936]: I0930 13:39:40.950928 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:40Z","lastTransitionTime":"2025-09-30T13:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.053634 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.053678 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.053688 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.053706 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.053716 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.156284 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.156329 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.156358 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.156375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.156385 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.259710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.259786 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.259799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.259822 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.259843 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.314512 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.314522 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.314537 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:41 crc kubenswrapper[4936]: E0930 13:39:41.314949 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:41 crc kubenswrapper[4936]: E0930 13:39:41.315002 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:41 crc kubenswrapper[4936]: E0930 13:39:41.314710 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.361921 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.361961 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.361972 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.361988 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.361998 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.471288 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.471730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.471858 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.471978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.472092 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.576120 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.576166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.576178 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.576198 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.576214 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.577993 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/0.log" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.580942 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89" exitCode=1 Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.580999 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.581687 4936 scope.go:117] "RemoveContainer" containerID="7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.598656 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.613771 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.630785 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.647122 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40e
ddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.664036 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.678577 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.681749 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc 
kubenswrapper[4936]: I0930 13:39:41.681779 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.681791 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.681808 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.681819 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.701025 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13
:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.716607 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.733953 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.749958 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.765623 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.782862 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.784126 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.784177 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.784192 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.784214 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.784228 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.799152 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.814537 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.887539 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.887606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.887622 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.887647 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.887665 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.990652 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.990732 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.990758 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.990790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:41 crc kubenswrapper[4936]: I0930 13:39:41.990812 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:41Z","lastTransitionTime":"2025-09-30T13:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.093249 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.093308 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.093325 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.093384 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.093405 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.195388 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.195431 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.195439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.195460 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.195472 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.298706 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.298748 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.298756 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.298772 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.298782 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.401013 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.401055 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.401065 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.401081 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.401090 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.503700 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.503751 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.503768 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.503791 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.503810 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.586286 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/0.log" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.600772 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.600964 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.605516 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.605545 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.605555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.605569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.605578 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.620180 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.636820 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.649972 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.660510 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.676557 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.690255 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.706663 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.708485 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.708539 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.708549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.708565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.708602 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.718904 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.737529 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8
eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.752031 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f
67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327f
dd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.766475 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 
13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.781301 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.797392 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.811439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc 
kubenswrapper[4936]: I0930 13:39:42.811498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.811510 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.811524 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.811532 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.816439 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 
6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.916982 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.917049 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.917067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.917094 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:42 crc kubenswrapper[4936]: I0930 13:39:42.917113 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:42Z","lastTransitionTime":"2025-09-30T13:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.019725 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.019790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.019803 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.019826 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.019839 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.122564 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.122623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.122639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.122661 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.122678 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.225752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.226082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.226196 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.226373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.226503 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.304185 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6"] Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.304790 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.307545 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.307896 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.314288 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:43 crc kubenswrapper[4936]: E0930 13:39:43.314460 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.314294 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:43 crc kubenswrapper[4936]: E0930 13:39:43.314565 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.314295 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:43 crc kubenswrapper[4936]: E0930 13:39:43.314636 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.325440 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.329115 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.329160 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.329174 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.329194 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.329210 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.339869 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.364517 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.382703 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee0
1375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.399302 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.399634 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.399787 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.399838 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxcgz\" (UniqueName: \"kubernetes.io/projected/f97225f9-be97-4bc9-841b-fc96e4a8be4d-kube-api-access-zxcgz\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.400015 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.413226 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.426193 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.432140 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 
13:39:43.432198 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.432210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.432260 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.432272 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.440108 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.456290 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.467910 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.484981 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.494712 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.501459 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.501631 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.501726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.501811 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxcgz\" (UniqueName: 
\"kubernetes.io/projected/f97225f9-be97-4bc9-841b-fc96e4a8be4d-kube-api-access-zxcgz\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.502626 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.502702 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f97225f9-be97-4bc9-841b-fc96e4a8be4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.506786 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f97225f9-be97-4bc9-841b-fc96e4a8be4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.509099 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.520052 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxcgz\" (UniqueName: 
\"kubernetes.io/projected/f97225f9-be97-4bc9-841b-fc96e4a8be4d-kube-api-access-zxcgz\") pod \"ovnkube-control-plane-749d76644c-xwqb6\" (UID: \"f97225f9-be97-4bc9-841b-fc96e4a8be4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.523956 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":
\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.534391 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.534527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.534587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.534740 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.534809 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.535287 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.606573 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/1.log" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.607204 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/0.log" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.610446 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb" exitCode=1 Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.610504 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.610552 4936 scope.go:117] "RemoveContainer" containerID="7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.612331 4936 scope.go:117] "RemoveContainer" containerID="d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb" Sep 30 13:39:43 crc kubenswrapper[4936]: E0930 13:39:43.614262 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.625782 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.629016 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.637545 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.637593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.637606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.637627 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.637639 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: W0930 13:39:43.643973 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf97225f9_be97_4bc9_841b_fc96e4a8be4d.slice/crio-47cb91abcbc15101f913c6bd115ff873be8988565e168dab7fbed6c15a36676b WatchSource:0}: Error finding container 47cb91abcbc15101f913c6bd115ff873be8988565e168dab7fbed6c15a36676b: Status 404 returned error can't find the container with id 47cb91abcbc15101f913c6bd115ff873be8988565e168dab7fbed6c15a36676b Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.654233 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.674309 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.695674 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.716108 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40e
ddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.732441 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.741225 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.741273 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.741285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.741304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.741316 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.752551 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.765980 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.783689 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 
3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.794795 4936 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.804756 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.814509 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.823031 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.832412 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.843412 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.846268 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.846309 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.846320 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.846352 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.846365 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.948262 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.948291 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.948301 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.948318 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:43 crc kubenswrapper[4936]: I0930 13:39:43.948327 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:43Z","lastTransitionTime":"2025-09-30T13:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.049968 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.050013 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.050022 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.050038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.050047 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.062352 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2v46m"] Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.062813 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: E0930 13:39:44.062874 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.074563 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.085964 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.096449 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.106925 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.122431 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.132242 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.142914 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152188 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152198 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152226 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.152484 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.165170 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8
eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.178282 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f
67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327f
dd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.190121 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.202697 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.206988 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrg7\" (UniqueName: \"kubernetes.io/projected/e3bd8048-3efa-41ed-a7ff-8d477db72be7-kube-api-access-hxrg7\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.207050 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.215435 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.225400 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.244463 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 
3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.254888 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.254963 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.254981 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.255508 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.255569 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.255963 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc 
kubenswrapper[4936]: I0930 13:39:44.308449 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.308508 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrg7\" (UniqueName: \"kubernetes.io/projected/e3bd8048-3efa-41ed-a7ff-8d477db72be7-kube-api-access-hxrg7\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: E0930 13:39:44.308734 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:44 crc kubenswrapper[4936]: E0930 13:39:44.308854 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:44.808821893 +0000 UTC m=+35.192824224 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.324074 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrg7\" (UniqueName: \"kubernetes.io/projected/e3bd8048-3efa-41ed-a7ff-8d477db72be7-kube-api-access-hxrg7\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.358944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.358981 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.358990 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.359006 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.359016 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.461566 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.461638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.461662 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.461702 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.461726 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.564375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.564446 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.564467 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.564498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.564520 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.616016 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/1.log" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.621398 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" event={"ID":"f97225f9-be97-4bc9-841b-fc96e4a8be4d","Type":"ContainerStarted","Data":"f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.621455 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" event={"ID":"f97225f9-be97-4bc9-841b-fc96e4a8be4d","Type":"ContainerStarted","Data":"2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.621471 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" event={"ID":"f97225f9-be97-4bc9-841b-fc96e4a8be4d","Type":"ContainerStarted","Data":"47cb91abcbc15101f913c6bd115ff873be8988565e168dab7fbed6c15a36676b"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.640527 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.655845 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a0689
5f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:
39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.666777 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.666818 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.666830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.666845 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.666855 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.669273 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.685512 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.698521 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.716795 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 
3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.727391 4936 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc 
kubenswrapper[4936]: I0930 13:39:44.741173 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.755535 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.765869 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.768900 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.768933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.768947 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc 
kubenswrapper[4936]: I0930 13:39:44.768964 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.768974 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.777379 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.786197 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.798053 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.808712 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.814595 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:44 crc kubenswrapper[4936]: E0930 13:39:44.814800 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:44 crc kubenswrapper[4936]: E0930 13:39:44.814880 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:39:45.814861203 +0000 UTC m=+36.198863504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.819299 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.828884 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.872127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.872191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.872204 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.872227 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.872243 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.974525 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.974587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.974601 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.974617 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:44 crc kubenswrapper[4936]: I0930 13:39:44.974627 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:44Z","lastTransitionTime":"2025-09-30T13:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.077807 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.077857 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.077868 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.077888 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.077900 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.180605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.180703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.180722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.180745 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.180774 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.284029 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.284081 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.284093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.284113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.284158 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.314318 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.314404 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.314470 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.314465 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.314598 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.314734 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.314855 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.314933 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.386723 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.386763 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.386776 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.386794 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.386806 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.489181 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.489218 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.489230 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.489250 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.489263 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.592005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.592072 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.592095 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.592124 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.592147 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.695704 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.695770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.695789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.695814 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.695831 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.799572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.799963 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.800128 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.800315 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.800552 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.826694 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.826916 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:45 crc kubenswrapper[4936]: E0930 13:39:45.827014 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:47.826984364 +0000 UTC m=+38.210986705 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.903901 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.904245 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.904504 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.904703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:45 crc kubenswrapper[4936]: I0930 13:39:45.904873 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:45Z","lastTransitionTime":"2025-09-30T13:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.007778 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.007830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.007849 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.007871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.007883 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.111398 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.111446 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.111457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.111474 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.111484 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.213537 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.213591 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.213603 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.213623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.213635 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.316182 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.316243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.316253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.316267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.316276 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.418317 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.418382 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.418393 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.418407 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.418602 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.520944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.521001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.521026 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.521064 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.521082 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.623902 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.624018 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.624035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.624076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.624091 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.726658 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.726722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.726739 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.726770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.726793 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.829861 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.829924 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.829946 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.829974 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.829995 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.836790 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.836873 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.836957 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837002 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.837020 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837098 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:02.83706585 +0000 UTC m=+53.221068181 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837598 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837638 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837684 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837867 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:40:02.837738079 +0000 UTC m=+53.221740420 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837952 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.837984 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.838004 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.838075 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:02.838054097 +0000 UTC m=+53.222056428 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.838088 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.838157 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:02.838130029 +0000 UTC m=+53.222132370 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.932925 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.932999 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.933021 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.933048 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.933066 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:46Z","lastTransitionTime":"2025-09-30T13:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:46 crc kubenswrapper[4936]: I0930 13:39:46.938249 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:39:46 crc kubenswrapper[4936]: E0930 13:39:46.938477 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:40:02.938455359 +0000 UTC m=+53.322457670 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.035373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.035450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.035474 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.035507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: 
I0930 13:39:47.035576 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.138952 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.139060 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.139084 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.139147 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.139171 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.242263 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.242317 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.242360 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.242410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.242426 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.314999 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.315083 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.315653 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.315826 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.315756 4936 scope.go:117] "RemoveContainer" containerID="32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.315722 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.316406 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.316702 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.316811 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.349724 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.349759 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.349771 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.349788 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.349802 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.452277 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.452310 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.452321 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.452354 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.452367 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.555732 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.555774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.555783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.555799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.555808 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.634412 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.636598 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.637114 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.650318 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.658144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.658195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.658207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc 
kubenswrapper[4936]: I0930 13:39:47.658223 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.658235 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.660409 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.670194 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.684954 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a0689
5f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:
39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.695996 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.707685 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.719434 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.737691 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 
3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.749914 4936 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc 
kubenswrapper[4936]: I0930 13:39:47.760865 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.760904 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.760915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.760936 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.760948 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.762615 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.774632 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.786481 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.797376 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.807147 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.821735 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.833703 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.849208 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.849391 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:47 crc kubenswrapper[4936]: E0930 13:39:47.849487 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:51.849464288 +0000 UTC m=+42.233466609 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.863463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.863508 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.863535 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.863555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.863567 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.966816 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.966857 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.966869 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.966886 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:47 crc kubenswrapper[4936]: I0930 13:39:47.966897 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:47Z","lastTransitionTime":"2025-09-30T13:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.069690 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.069722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.069731 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.069746 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.069755 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.172174 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.172218 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.172229 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.172246 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.172258 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.274387 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.274680 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.274761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.274824 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.274879 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.377810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.377869 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.377879 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.377893 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.377902 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.480595 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.480663 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.480685 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.480718 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.480740 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.582501 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.582549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.582560 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.582579 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.582591 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.684616 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.684664 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.684673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.684689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.684699 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.787527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.787567 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.787577 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.787593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.787603 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.889645 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.889687 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.889699 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.889717 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.889730 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.991962 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.992187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.992261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.992380 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:48 crc kubenswrapper[4936]: I0930 13:39:48.992443 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:48Z","lastTransitionTime":"2025-09-30T13:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.094322 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.094407 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.094421 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.094439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.094451 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.197103 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.197166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.197189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.197219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.197242 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.300435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.300472 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.300482 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.300499 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.300511 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.315080 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.315123 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.315129 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.315282 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:49 crc kubenswrapper[4936]: E0930 13:39:49.315488 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:49 crc kubenswrapper[4936]: E0930 13:39:49.315618 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:49 crc kubenswrapper[4936]: E0930 13:39:49.315745 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:49 crc kubenswrapper[4936]: E0930 13:39:49.315871 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.403672 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.403891 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.403962 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.404063 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.404163 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.506612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.506646 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.506660 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.506678 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.506689 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.608538 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.608602 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.608616 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.608637 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.608648 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.710894 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.710947 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.710960 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.710977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.710989 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.813118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.813230 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.813251 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.813275 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.813292 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.915842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.915888 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.915902 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.915919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:49 crc kubenswrapper[4936]: I0930 13:39:49.915932 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:49Z","lastTransitionTime":"2025-09-30T13:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.018962 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.019005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.019017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.019032 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.019044 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.121526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.121806 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.121906 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.121992 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.122061 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.224892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.224930 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.224969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.224989 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.225001 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.330460 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.331100 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.331181 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.331208 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.331244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.331268 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.345213 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.363072 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.375644 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.394470 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.412667 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.414695 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.414746 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.414768 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.414799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.414820 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.434863 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: E0930 13:39:50.435416 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.441781 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.441831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.441844 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.441862 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.441872 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.453672 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: E0930 13:39:50.459756 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.470920 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.470977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.470997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.471023 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.471041 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.472699 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.483637 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: E0930 13:39:50.491016 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.495553 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.495697 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.495828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.495960 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.496085 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.503506 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: E0930 13:39:50.510547 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.514219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.514264 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.514276 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.514293 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.514307 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.515730 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc 
kubenswrapper[4936]: E0930 13:39:50.530234 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: E0930 13:39:50.530421 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.532123 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.532158 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.532171 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.532190 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.532203 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.535025 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.552018 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.565554 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.582004 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a09f85fd216b9dbfe662a23570d83cda52f735c9997f1ad745e5cbf5fec5f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:41Z\\\",\\\"message\\\":\\\"35805 6136 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:39:41.135830 6136 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:39:41.135847 6136 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:39:41.135859 6136 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 13:39:41.135879 6136 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:39:41.135898 6136 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:39:41.135918 6136 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:39:41.136049 6136 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:39:41.136185 6136 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136232 6136 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136432 6136 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136749 6136 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.136771 6136 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:39:41.137210 6136 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 
3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.634561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.634604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.634616 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.634634 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.634646 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.737726 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.737775 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.737785 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.737804 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.737815 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.840150 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.840177 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.840206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.840221 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.840230 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.943440 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.943495 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.943507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.943528 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:50 crc kubenswrapper[4936]: I0930 13:39:50.943541 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:50Z","lastTransitionTime":"2025-09-30T13:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.046294 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.046364 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.046374 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.046390 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.046401 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.150279 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.150326 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.150379 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.150404 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.150421 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.253286 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.253348 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.253357 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.253372 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.253383 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.315235 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.315397 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.315504 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.315515 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.315649 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.315716 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.315759 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.315823 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.355730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.355775 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.355784 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.355799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.355810 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.458233 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.458313 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.458372 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.458426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.458452 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.561493 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.561561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.561582 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.561609 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.561627 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.665400 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.665529 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.665549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.665574 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.665590 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.768311 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.768391 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.768403 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.768417 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.768426 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.872019 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.872088 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.872109 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.872134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.872189 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.901820 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.902027 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:51 crc kubenswrapper[4936]: E0930 13:39:51.902136 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:39:59.902105526 +0000 UTC m=+50.286107867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.975038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.975113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.975137 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.975169 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:51 crc kubenswrapper[4936]: I0930 13:39:51.975191 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:51Z","lastTransitionTime":"2025-09-30T13:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.077950 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.078003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.078018 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.078038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.078055 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.181469 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.181736 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.181759 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.181790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.181816 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.284633 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.284691 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.284700 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.284717 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.284748 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.386667 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.386754 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.386765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.386782 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.386793 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.489138 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.489172 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.489180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.489194 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.489204 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.591463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.591512 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.591521 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.591536 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.591546 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.694186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.695073 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.695100 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.695118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.695131 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.798150 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.798205 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.798219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.798240 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.798252 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.901934 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.901974 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.901984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.902001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.902013 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:52Z","lastTransitionTime":"2025-09-30T13:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.988785 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:52 crc kubenswrapper[4936]: I0930 13:39:52.989661 4936 scope.go:117] "RemoveContainer" containerID="d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.004308 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.004908 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.005067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.005199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.005408 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.005560 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.019634 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.030452 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.039755 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.051474 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.062095 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.072442 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.081899 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.093511 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.107940 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.108007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.108020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.108039 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.108072 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.111928 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.126885 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.140670 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.153169 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.166022 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.189534 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node 
crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.201031 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.210427 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.210459 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.210469 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.210484 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.210494 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.312977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.313020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.313030 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.313046 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.313057 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.314415 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:53 crc kubenswrapper[4936]: E0930 13:39:53.314535 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.314545 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.314585 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.314620 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:53 crc kubenswrapper[4936]: E0930 13:39:53.314627 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:53 crc kubenswrapper[4936]: E0930 13:39:53.314866 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:53 crc kubenswrapper[4936]: E0930 13:39:53.314954 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.415917 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.415965 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.415975 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.415991 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.416003 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.518643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.518693 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.518705 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.518723 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.518736 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.621689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.621726 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.621734 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.621766 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.621778 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.657291 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/1.log" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.659363 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.660275 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.672255 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.682901 4936 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.694629 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.703374 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.715146 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.724910 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.724951 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.724962 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.724978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.724987 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.734697 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.746008 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.761902 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.775370 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.785502 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.804771 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node 
crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"
host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scr
ipt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.814232 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc 
kubenswrapper[4936]: I0930 13:39:53.826958 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.826996 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.827118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.827127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.827140 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.827148 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.837249 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.846519 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.854348 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.929244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.929546 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.929609 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.929689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:53 crc kubenswrapper[4936]: I0930 13:39:53.929766 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:53Z","lastTransitionTime":"2025-09-30T13:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.031668 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.031899 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.032051 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.032219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.032386 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.134535 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.134587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.134604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.134630 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.134645 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.237434 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.237850 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.238179 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.238543 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.238725 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.342161 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.342211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.342222 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.342242 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.342254 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.444503 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.444541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.444552 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.444568 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.444579 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.547749 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.547783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.547794 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.547836 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.547847 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.651479 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.651538 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.651550 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.651569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.651582 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.664544 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/2.log" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.665198 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/1.log" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.667931 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" exitCode=1 Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.667999 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.668077 4936 scope.go:117] "RemoveContainer" containerID="d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.668817 4936 scope.go:117] "RemoveContainer" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" Sep 30 13:39:54 crc kubenswrapper[4936]: E0930 13:39:54.669017 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.686791 4936 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.701893 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.721957 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a
8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc3599
47e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.736707 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.752383 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.754082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.754131 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.754146 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.754170 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.754192 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.768049 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.796806 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e04f1ca831a9a7ce4a37d8f7fb5d9f7c91e73cd4cc3b198362b7f466ef3ccb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:42Z\\\",\\\"message\\\":\\\"y.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 13:39:42.573473 6256 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-vxjrh in node crc\\\\nI0930 13:39:42.573729 6256 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node 
crc\\\\nI0930 13:39:42.573658 6256 services_controller.go:434] Service openshift-apiserver/api retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{api openshift-apiserver 3b54abf8-b632-44a4-b36d-9f489b41a2d2 4787 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:openshift-apiserver] map[operator.openshift.io/spec-hash:9c74227d7f96d723d980c50373a5e91f08c5893365bfd5a5040449b1b6585a23 service.alpha.openshift.io/serving-cert-secret-name:serving-cert service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{apiserver: true,},ClusterIP:10.217.5.37,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNode\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m 
openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h
77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.810201 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc 
kubenswrapper[4936]: I0930 13:39:54.828033 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.843652 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.856822 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.856863 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.856888 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.856906 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.856919 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.859271 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.873248 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.883457 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.895029 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.907309 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.921230 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.960135 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.960176 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.960186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.960206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:54 crc kubenswrapper[4936]: I0930 13:39:54.960217 4936 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:54Z","lastTransitionTime":"2025-09-30T13:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.063403 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.063447 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.063456 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.063473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.063487 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.166608 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.166644 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.166652 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.166668 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.166678 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.268996 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.269072 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.269089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.269113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.269130 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.314369 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.314508 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.314510 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:55 crc kubenswrapper[4936]: E0930 13:39:55.314587 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.314730 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:55 crc kubenswrapper[4936]: E0930 13:39:55.314738 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:55 crc kubenswrapper[4936]: E0930 13:39:55.314869 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:55 crc kubenswrapper[4936]: E0930 13:39:55.315026 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.371878 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.371954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.371976 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.372005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.372026 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.474355 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.474395 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.474405 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.474419 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.474429 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.577215 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.577637 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.577657 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.577684 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.577703 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.673770 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/2.log" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.678773 4936 scope.go:117] "RemoveContainer" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" Sep 30 13:39:55 crc kubenswrapper[4936]: E0930 13:39:55.678970 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.679611 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.679664 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.679676 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.679696 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.679709 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.697028 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.713645 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.733373 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.750406 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.765627 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.782967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.783008 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.783019 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.783037 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.783049 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.787133 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.798703 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.813722 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.826809 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.848482 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.860676 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.875945 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.884848 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.884910 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.884923 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.884942 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.884954 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.896285 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.912401 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.926229 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.938553 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:39:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.988373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.988430 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.988439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.988461 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:55 crc kubenswrapper[4936]: I0930 13:39:55.988472 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:55Z","lastTransitionTime":"2025-09-30T13:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.090712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.090761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.090774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.090810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.090822 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.193217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.193319 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.193328 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.193360 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.193380 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.296607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.296660 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.296670 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.296689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.296701 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.400000 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.400086 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.400118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.400157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.400224 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.502594 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.502641 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.502653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.502672 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.502685 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.609931 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.610009 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.610030 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.610056 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.610074 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.712201 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.712237 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.712245 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.712259 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.712267 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.815089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.815156 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.815173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.815199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.815214 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.918508 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.918583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.918607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.918637 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:56 crc kubenswrapper[4936]: I0930 13:39:56.918659 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:56Z","lastTransitionTime":"2025-09-30T13:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.021124 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.021222 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.021248 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.021288 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.021427 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.124562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.124616 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.124632 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.124654 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.124669 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.227066 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.227127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.227141 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.227159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.227171 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.315288 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.315307 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:57 crc kubenswrapper[4936]: E0930 13:39:57.315555 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.315356 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:57 crc kubenswrapper[4936]: E0930 13:39:57.315617 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.315356 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:57 crc kubenswrapper[4936]: E0930 13:39:57.315713 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:57 crc kubenswrapper[4936]: E0930 13:39:57.315772 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.328907 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.328978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.329002 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.329035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.329064 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.431496 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.431535 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.431545 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.431562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.431571 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.535984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.536062 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.536077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.536116 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.536129 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.639733 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.639801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.639817 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.639841 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.639856 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.742953 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.743033 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.743057 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.743092 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.743116 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.846728 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.846810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.846835 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.846865 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.846890 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.950409 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.950482 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.950506 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.950538 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:57 crc kubenswrapper[4936]: I0930 13:39:57.950561 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:57Z","lastTransitionTime":"2025-09-30T13:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.053156 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.053222 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.053233 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.053250 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.053260 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.156749 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.156830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.156854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.156931 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.156957 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.260265 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.260361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.260383 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.260410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.260428 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.363035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.363092 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.363110 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.363130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.363146 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.467022 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.467096 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.467117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.467146 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.467168 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.570778 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.570830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.570842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.570862 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.570874 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.673158 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.673205 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.673216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.673233 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.673248 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.776044 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.776117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.776140 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.776170 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.776188 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.878736 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.879410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.879553 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.879673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.879786 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.982503 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.982551 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.982560 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.982576 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:58 crc kubenswrapper[4936]: I0930 13:39:58.982586 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:58Z","lastTransitionTime":"2025-09-30T13:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.085366 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.085410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.085423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.085438 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.085448 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.188514 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.188576 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.188598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.188621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.188637 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.291434 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.291489 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.291503 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.291529 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.291545 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.315005 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.315057 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.315010 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.315307 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.315575 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.315867 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.316034 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.316251 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.395558 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.395915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.396062 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.396185 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.396328 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.499926 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.499996 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.500016 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.500041 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.500058 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.603307 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.603397 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.603410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.603431 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.603452 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.706190 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.706261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.706280 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.706306 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.706324 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.809431 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.809473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.809485 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.809504 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.809517 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.912113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.912173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.912185 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.912209 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.912225 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:39:59Z","lastTransitionTime":"2025-09-30T13:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:39:59 crc kubenswrapper[4936]: I0930 13:39:59.988573 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.988827 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:39:59 crc kubenswrapper[4936]: E0930 13:39:59.988950 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:15.988912537 +0000 UTC m=+66.372915028 (durationBeforeRetry 16s). 
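The nestedpendingoperations record above schedules the next MountVolume retry for 13:40:15.988912, 16 seconds after the failure at 13:39:59.988950 (kubelet backs off repeated volume-mount failures, doubling the delay between attempts). The logged durationBeforeRetry can be checked against the two timestamps; a minimal sketch, with both datetimes copied from the log rather than parsed from it:

```python
from datetime import datetime, timezone

# Timestamps taken from the nestedpendingoperations record above:
# failure logged at 13:39:59.988950, next retry permitted at 13:40:15.988912.
failure = datetime(2025, 9, 30, 13, 39, 59, 988950, tzinfo=timezone.utc)
next_retry = datetime(2025, 9, 30, 13, 40, 15, 988912, tzinfo=timezone.utc)

# Difference between the two, in seconds.
delay = (next_retry - failure).total_seconds()
print(f"durationBeforeRetry is about {round(delay)}s")
```

The 16 s gap (versus kubelet's initial sub-second retry delay) indicates this mount has already failed several times in this boot.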
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.016392 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.016463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.016487 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.016518 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.016541 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.119585 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.119643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.119653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.119674 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.119686 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.222501 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.222555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.222573 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.222597 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.222611 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.325232 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.325280 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.325294 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.325311 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.325324 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.328055 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.346082 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.362398 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.372522 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.377476 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.384941 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.393249 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.409575 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.426867 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.427505 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.427572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.427591 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.427619 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.427637 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.441226 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.455449 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.468451 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.480869 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.491652 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc 
kubenswrapper[4936]: I0930 13:40:00.507423 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.521510 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.530038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.530082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.530098 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.530123 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.530138 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.537201 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.557383 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.569929 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.585216 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.603265 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40e
ddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.615245 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.629504 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.632458 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.632710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.633062 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.633281 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.633453 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.643674 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.656658 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.681857 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.701689 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.723089 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.735943 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.735976 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.735984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.736000 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.736009 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.744399 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.762817 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.781917 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.796851 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.813272 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.829915 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.839187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.839255 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.839272 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.839301 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.839318 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.849055 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.896685 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.896740 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.896755 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.896776 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.896790 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.912832 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.916857 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.916895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.916909 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.916933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.916950 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.929350 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.936866 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.936911 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.936921 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.936940 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.936952 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.949157 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.952974 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.953020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.953033 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.953051 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.953063 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.965752 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.970213 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.970274 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.970291 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.970317 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.970353 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.982670 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:00Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:00 crc kubenswrapper[4936]: E0930 13:40:00.982885 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.984569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.984615 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.984626 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.984642 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:00 crc kubenswrapper[4936]: I0930 13:40:00.984654 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:00Z","lastTransitionTime":"2025-09-30T13:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.087324 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.087401 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.087414 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.087435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.087447 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.189886 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.189928 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.189966 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.190001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.190016 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.292964 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.293007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.293018 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.293034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.293044 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.314680 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.314718 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:01 crc kubenswrapper[4936]: E0930 13:40:01.314808 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.314680 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.314857 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:01 crc kubenswrapper[4936]: E0930 13:40:01.314923 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:01 crc kubenswrapper[4936]: E0930 13:40:01.315015 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:01 crc kubenswrapper[4936]: E0930 13:40:01.315123 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.396209 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.396270 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.396289 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.396314 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.396331 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.498653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.498705 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.498723 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.498743 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.498760 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.602113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.602223 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.602244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.602262 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.602273 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.704252 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.704398 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.704423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.704452 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.704471 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.765654 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.785565 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.807819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.807917 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.807980 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.807950 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.808079 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.808225 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.824533 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.848413 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.873174 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.885655 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.904651 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe
6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.911458 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.911493 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.911507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.911527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.911543 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:01Z","lastTransitionTime":"2025-09-30T13:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.922014 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.934771 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:01 crc kubenswrapper[4936]: I0930 13:40:01.990785 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:01Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.004981 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.013722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.013755 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.013766 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.013782 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.013795 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.021056 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.035703 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.050061 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.062114 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.075421 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.086790 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:02Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.116160 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.116195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.116206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.116225 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.116241 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.218804 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.219115 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.219180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.219253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.219311 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.321708 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.321758 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.321773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.321795 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.321811 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.424954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.425000 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.425016 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.425037 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.425052 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.527361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.527439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.527457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.527504 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.527521 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.631595 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.631673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.631691 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.631722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.631747 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.733960 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.734020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.734039 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.734064 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.734082 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.836931 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.836989 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.837009 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.837034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.837051 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.917298 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.917427 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.917491 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.917537 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917597 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917644 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917670 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917719 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917735 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917785 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917754 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917859 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:40:02 crc 
kubenswrapper[4936]: E0930 13:40:02.917835 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:34.917797234 +0000 UTC m=+85.301799565 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917957 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:34.917922898 +0000 UTC m=+85.301925239 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.917984 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:34.917969319 +0000 UTC m=+85.301971660 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:40:02 crc kubenswrapper[4936]: E0930 13:40:02.918018 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:34.91800822 +0000 UTC m=+85.302010561 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.940821 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.940903 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.940954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.940988 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:02 crc kubenswrapper[4936]: I0930 13:40:02.941025 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:02Z","lastTransitionTime":"2025-09-30T13:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.018663 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:40:03 crc kubenswrapper[4936]: E0930 13:40:03.018854 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:40:35.018827523 +0000 UTC m=+85.402829824 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.043313 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.043388 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.043401 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.043422 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.043437 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.146019 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.146061 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.146069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.146086 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.146096 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.247970 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.248009 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.248019 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.248087 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.248115 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.314783 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.314863 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.314934 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:40:03 crc kubenswrapper[4936]: E0930 13:40:03.314925 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.315020 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m"
Sep 30 13:40:03 crc kubenswrapper[4936]: E0930 13:40:03.315087 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:40:03 crc kubenswrapper[4936]: E0930 13:40:03.315035 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:40:03 crc kubenswrapper[4936]: E0930 13:40:03.315320 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.350507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.350548 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.350559 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.350578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.350598 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.453231 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.453284 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.453297 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.453315 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.453345 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.555292 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.555415 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.555427 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.555449 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.555462 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.658172 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.658232 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.658244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.658266 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.658281 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.761520 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.761605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.761628 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.761660 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.761683 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.864623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.864692 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.864715 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.864749 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.864780 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.967297 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.967366 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.967384 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.967405 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:03 crc kubenswrapper[4936]: I0930 13:40:03.967417 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:03Z","lastTransitionTime":"2025-09-30T13:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.071737 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.071792 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.071810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.071839 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.071858 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.174718 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.174761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.174782 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.174805 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.174819 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.277483 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.277571 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.277587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.277608 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.277623 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.380076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.380526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.380682 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.380846 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.380992 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.483137 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.483191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.483204 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.483220 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.483235 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.585815 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.585853 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.585864 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.585879 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.585889 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.688776 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.688818 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.688828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.688843 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.688853 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.791783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.791830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.791838 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.791854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.791877 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.894540 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.894592 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.894606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.894638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.894650 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.997871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.997920 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.997933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.997954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:04 crc kubenswrapper[4936]: I0930 13:40:04.997966 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:04Z","lastTransitionTime":"2025-09-30T13:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.101396 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.101475 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.101501 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.101527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.101546 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.205739 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.205793 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.205809 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.205833 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.205850 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.308115 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.308159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.308170 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.308189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.308200 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.314686 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.314744 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.314794 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:40:05 crc kubenswrapper[4936]: E0930 13:40:05.314817 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.314833 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:40:05 crc kubenswrapper[4936]: E0930 13:40:05.315033 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:40:05 crc kubenswrapper[4936]: E0930 13:40:05.315123 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:40:05 crc kubenswrapper[4936]: E0930 13:40:05.315450 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.410951 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.411027 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.411049 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.411078 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.411099 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.513939 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.513993 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.514005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.514024 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.514413 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.616285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.616374 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.616384 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.616400 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.616408 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.718549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.718582 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.718591 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.718605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.718615 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.822164 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.822217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.822235 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.822261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.822278 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.924944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.924996 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.925007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.925027 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:05 crc kubenswrapper[4936]: I0930 13:40:05.925039 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:05Z","lastTransitionTime":"2025-09-30T13:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.028093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.028164 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.028182 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.028206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.028222 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.130998 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.131032 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.131040 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.131055 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.131063 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.233760 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.233806 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.233815 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.233831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.233840 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.336085 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.336131 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.336140 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.336155 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.336164 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.438911 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.438955 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.438971 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.438990 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.439004 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.541081 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.541361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.541443 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.541533 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.541592 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.644884 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.644967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.644977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.644994 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.645004 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.747933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.748202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.748291 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.748388 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.748465 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.851462 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.851516 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.851526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.851541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.851550 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.954527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.954607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.954630 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.954666 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:06 crc kubenswrapper[4936]: I0930 13:40:06.954688 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:06Z","lastTransitionTime":"2025-09-30T13:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.057157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.057216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.057236 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.057264 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.057281 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.159263 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.159330 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.159380 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.159411 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.159433 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.262465 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.262544 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.262567 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.262599 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.262624 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.314713 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.314759 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.314765 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.314770 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:07 crc kubenswrapper[4936]: E0930 13:40:07.314926 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:07 crc kubenswrapper[4936]: E0930 13:40:07.315056 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:07 crc kubenswrapper[4936]: E0930 13:40:07.315179 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:07 crc kubenswrapper[4936]: E0930 13:40:07.315318 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.366838 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.366919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.366943 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.366976 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.367000 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.469587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.469643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.469658 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.469676 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.469688 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.571949 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.572046 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.572063 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.572080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.572090 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.675102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.675145 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.675157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.675175 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.675187 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.777934 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.778012 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.778031 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.778061 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.778081 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.881000 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.881043 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.881051 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.881067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.881079 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.983515 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.983561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.983572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.983589 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:07 crc kubenswrapper[4936]: I0930 13:40:07.983601 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:07Z","lastTransitionTime":"2025-09-30T13:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.086859 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.086912 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.086924 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.086945 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.086960 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.189738 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.189788 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.189803 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.189820 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.189830 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.292575 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.292662 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.292673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.292694 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.292705 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.396507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.396557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.396567 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.396587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.396602 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.499606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.499712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.499726 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.499745 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.499761 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.602113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.602173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.602186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.602205 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.602216 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.704299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.704392 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.704408 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.704434 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.704450 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.807081 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.807133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.807150 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.807169 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.807186 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.910261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.910303 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.910311 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.910328 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:08 crc kubenswrapper[4936]: I0930 13:40:08.910356 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:08Z","lastTransitionTime":"2025-09-30T13:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.012744 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.012811 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.012834 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.012866 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.012889 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.115700 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.115777 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.115794 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.115821 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.115837 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.218151 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.218245 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.218255 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.218270 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.218279 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.315111 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.315163 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.315191 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.315133 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:09 crc kubenswrapper[4936]: E0930 13:40:09.315401 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:09 crc kubenswrapper[4936]: E0930 13:40:09.315498 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:09 crc kubenswrapper[4936]: E0930 13:40:09.315601 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:09 crc kubenswrapper[4936]: E0930 13:40:09.315690 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.321457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.321531 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.321556 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.321588 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.321609 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.424717 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.424817 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.424836 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.424863 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.424886 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.527999 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.528050 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.528062 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.528089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.528104 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.630966 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.630997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.631008 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.631024 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.631033 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.736217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.736258 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.736268 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.736285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.736295 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.839120 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.839173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.839184 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.839203 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.839240 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.941207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.941253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.941276 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.941295 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:09 crc kubenswrapper[4936]: I0930 13:40:09.941303 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:09Z","lastTransitionTime":"2025-09-30T13:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.043768 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.043810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.043819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.043840 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.043850 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.146388 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.146447 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.146464 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.146487 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.146506 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.249993 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.251324 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.251595 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.251824 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.252160 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.316174 4936 scope.go:117] "RemoveContainer" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" Sep 30 13:40:10 crc kubenswrapper[4936]: E0930 13:40:10.317089 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.335067 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084
652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.351774 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.355621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.355674 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.355690 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.355715 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.355741 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.370569 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.392778 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.414356 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.433542 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.453512 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.458372 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.458404 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.458415 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.458431 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.458441 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.471454 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.486952 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\
\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.503543 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.517891 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.538982 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.550122 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.561666 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.561772 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.561800 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.561834 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.561854 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.563222 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.577195 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.587780 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.596187 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.664004 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.664042 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.664054 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.664071 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.664084 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.766076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.766114 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.766125 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.766139 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.766147 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.868610 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.868911 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.869092 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.869190 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.869270 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.971633 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.971678 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.971691 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.971711 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:10 crc kubenswrapper[4936]: I0930 13:40:10.971726 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:10Z","lastTransitionTime":"2025-09-30T13:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.074271 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.074341 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.074351 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.074368 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.074377 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.177312 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.178108 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.178153 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.178185 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.178206 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.191006 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.191052 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.191063 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.191080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.191090 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.207932 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.213659 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.213710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.213721 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.213746 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.213758 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.229026 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.236123 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.236494 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.236739 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.236952 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.237154 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.254527 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.259507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.259582 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.259616 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.259644 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.259663 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.281765 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.290162 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.290210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.290221 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.290243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.290255 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.312861 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.313406 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315279 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.315484 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315556 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315771 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315784 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315717 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.315811 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.316146 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.316262 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.316223 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:11 crc kubenswrapper[4936]: E0930 13:40:11.316700 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.418730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.419088 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.419183 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.419284 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.419392 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.521911 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.521959 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.521974 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.522076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.522095 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.624106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.624160 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.624173 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.624191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.624201 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.726309 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.726374 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.726383 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.726402 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.726412 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.829431 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.829479 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.829490 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.829509 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.829521 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.932244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.932299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.932309 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.932346 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:11 crc kubenswrapper[4936]: I0930 13:40:11.932357 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:11Z","lastTransitionTime":"2025-09-30T13:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.035010 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.035067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.035084 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.035106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.035122 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.139904 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.139989 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.140014 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.140047 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.140068 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.243676 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.243730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.243744 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.243765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.243779 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.345888 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.345929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.345975 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.346002 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.346014 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.448586 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.448631 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.448642 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.448659 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.448672 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.550932 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.550984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.550995 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.551010 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.551018 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.653300 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.653371 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.653384 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.653405 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.653418 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.755661 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.755713 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.755722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.755740 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.755755 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.857664 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.857710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.857719 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.857734 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.857744 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.960690 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.960775 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.960789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.960811 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:12 crc kubenswrapper[4936]: I0930 13:40:12.960825 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:12Z","lastTransitionTime":"2025-09-30T13:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.064133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.064189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.064202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.064226 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.064240 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.166732 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.166789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.166803 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.166823 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.166836 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.268866 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.268915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.268929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.268948 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.268961 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.315288 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.315306 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.315355 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.315288 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:13 crc kubenswrapper[4936]: E0930 13:40:13.315438 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:13 crc kubenswrapper[4936]: E0930 13:40:13.315499 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:13 crc kubenswrapper[4936]: E0930 13:40:13.315557 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:13 crc kubenswrapper[4936]: E0930 13:40:13.315677 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.372752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.372815 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.372828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.372849 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.372868 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.476200 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.476249 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.476260 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.476278 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.476287 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.578881 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.578925 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.578933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.578948 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.578957 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.681612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.681643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.681651 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.681665 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.681674 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.784132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.784185 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.784193 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.784209 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.784221 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.886180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.886484 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.886561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.886647 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.886728 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.989510 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.989576 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.989593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.989618 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:13 crc kubenswrapper[4936]: I0930 13:40:13.989637 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:13Z","lastTransitionTime":"2025-09-30T13:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.092272 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.092325 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.092362 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.092386 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.092402 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.194604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.194647 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.194655 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.194674 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.194683 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.297581 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.297645 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.297660 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.297683 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.297696 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.400938 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.400968 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.400979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.400995 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.401006 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.530216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.530261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.530272 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.530291 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.530306 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.633625 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.633720 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.633738 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.633761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.633778 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.736790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.736831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.736843 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.736861 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.736873 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.840408 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.840471 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.840511 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.840538 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.840555 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.944113 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.944172 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.944188 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.944216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:14 crc kubenswrapper[4936]: I0930 13:40:14.944235 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:14Z","lastTransitionTime":"2025-09-30T13:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.048666 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.048752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.048774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.048801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.048822 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.152475 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.152528 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.152542 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.152562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.152575 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.255534 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.255568 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.255580 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.255595 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.255606 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.314369 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.314529 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.314618 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:15 crc kubenswrapper[4936]: E0930 13:40:15.314621 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:15 crc kubenswrapper[4936]: E0930 13:40:15.314773 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.314852 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:15 crc kubenswrapper[4936]: E0930 13:40:15.314934 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:15 crc kubenswrapper[4936]: E0930 13:40:15.315064 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.358514 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.358572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.358590 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.358614 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.358631 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.461127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.461171 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.461184 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.461202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.461214 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.563430 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.563462 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.563472 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.563487 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.563497 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.666571 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.666621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.666630 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.666656 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.666667 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.768707 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.768753 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.768765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.768785 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.768797 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.870520 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.870561 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.870573 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.870593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.870615 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.972842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.973072 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.973199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.973281 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:15 crc kubenswrapper[4936]: I0930 13:40:15.973361 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:15Z","lastTransitionTime":"2025-09-30T13:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.058794 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:16 crc kubenswrapper[4936]: E0930 13:40:16.058981 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:40:16 crc kubenswrapper[4936]: E0930 13:40:16.059049 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:40:48.059025479 +0000 UTC m=+98.443027780 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.076050 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.076090 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.076101 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.076118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.076129 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.178158 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.178202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.178211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.178228 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.178240 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.280839 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.280883 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.280894 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.280912 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.280924 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.383120 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.383156 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.383179 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.383195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.383206 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.486583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.486638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.486649 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.486670 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.486686 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.590426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.590463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.590473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.590491 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.590500 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.694636 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.694670 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.694681 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.694697 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.694709 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.796238 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.796272 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.796282 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.796299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.796310 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.899299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.899353 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.899362 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.899380 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:16 crc kubenswrapper[4936]: I0930 13:40:16.899391 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:16Z","lastTransitionTime":"2025-09-30T13:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.002532 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.002600 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.002620 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.002644 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.002665 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.106441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.106701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.106800 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.106905 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.106995 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.208963 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.209221 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.209314 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.209432 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.209541 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.312463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.312765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.312889 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.312986 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.313085 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.314683 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m"
Sep 30 13:40:17 crc kubenswrapper[4936]: E0930 13:40:17.314796 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.314953 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.315058 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.315017 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:40:17 crc kubenswrapper[4936]: E0930 13:40:17.315281 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:40:17 crc kubenswrapper[4936]: E0930 13:40:17.315524 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:40:17 crc kubenswrapper[4936]: E0930 13:40:17.315751 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.417557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.417598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.417609 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.417623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.417633 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.520412 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.520446 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.520457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.520475 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.520488 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.629671 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.629703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.629714 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.629728 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.629737 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.732029 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.732061 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.732070 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.732085 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.732093 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.833849 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.833880 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.833889 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.833903 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.833914 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.935940 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.935975 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.935986 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.935999 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:17 crc kubenswrapper[4936]: I0930 13:40:17.936010 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:17Z","lastTransitionTime":"2025-09-30T13:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.040133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.040206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.040222 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.040241 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.040254 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.142376 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.142401 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.142410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.142422 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.142432 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.244071 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.244114 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.244122 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.244136 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.244147 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.346117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.346172 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.346193 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.346219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.346240 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.448957 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.449635 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.449702 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.449807 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.449893 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.552632 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.552911 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.553003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.553099 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.553185 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.655812 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.655845 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.655853 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.655867 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.655876 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.754298 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/0.log"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.754370 4936 generic.go:334] "Generic (PLEG): container finished" podID="9dbb1e3f-927e-4587-835e-b21370b33262" containerID="0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5" exitCode=1
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.754400 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerDied","Data":"0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.754777 4936 scope.go:117] "RemoveContainer" containerID="0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.758675 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.758701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.758712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.758729 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.758741 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.776851 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z"
Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.788273 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.802635 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.818240 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.831237 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.843698 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.858432 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.860724 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.860788 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.860804 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.860827 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.860842 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.872933 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.895515 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.906188 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.918002 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba229489
77de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.930951 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.943074 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.954006 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.963365 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 
13:40:18.963400 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.963412 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.963426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.963437 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:18Z","lastTransitionTime":"2025-09-30T13:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.966314 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.980755 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:18 crc kubenswrapper[4936]: I0930 13:40:18.993187 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:18Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.065305 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.065369 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.065378 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.065392 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.065401 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.167420 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.167457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.167466 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.167498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.167507 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.270252 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.270304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.270321 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.270379 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.270395 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.314657 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.314671 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.314690 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:19 crc kubenswrapper[4936]: E0930 13:40:19.315088 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:19 crc kubenswrapper[4936]: E0930 13:40:19.314954 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.314715 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:19 crc kubenswrapper[4936]: E0930 13:40:19.315193 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:19 crc kubenswrapper[4936]: E0930 13:40:19.315237 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.375512 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.375831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.375853 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.375869 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.375883 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.481210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.481433 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.481522 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.481985 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.482008 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.584212 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.584244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.584253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.584267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.584278 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.687007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.687054 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.687068 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.687089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.687102 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.760087 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/0.log" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.760139 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerStarted","Data":"c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.771617 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.786088 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.788935 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.788961 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.788970 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.788982 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.788992 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.794602 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.805518 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee1
9f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.823400 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.834382 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.848119 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe
6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.860366 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.873391 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.891717 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc 
kubenswrapper[4936]: I0930 13:40:19.891742 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.891750 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.891763 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.891771 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.891670 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:
*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.904944 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.916011 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.927372 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.938038 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.947525 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.957836 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.967463 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:19Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.994141 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.994182 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.994190 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.994204 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:19 crc kubenswrapper[4936]: I0930 13:40:19.994213 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:19Z","lastTransitionTime":"2025-09-30T13:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.096892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.096945 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.096959 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.096977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.096990 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.199063 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.199124 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.199139 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.199157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.199170 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.300982 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.301026 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.301037 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.301054 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.301067 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.328930 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.340581 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.351414 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.362327 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.375442 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.390265 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.402827 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.402868 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.402879 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.403226 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.403258 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.410413 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.426616 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.438732 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.455327 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.465904 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.475996 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.486667 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.500059 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.507089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 
13:40:20.507142 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.507157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.507187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.507202 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.510647 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.522362 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.533094 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:20Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.608962 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.608994 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.609004 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.609020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.609031 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.711831 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.711895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.711908 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.711924 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.711936 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.814200 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.814232 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.814243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.814257 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.814265 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.917648 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.917942 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.918050 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.918183 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:20 crc kubenswrapper[4936]: I0930 13:40:20.918288 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:20Z","lastTransitionTime":"2025-09-30T13:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.021265 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.021318 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.021371 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.021389 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.021399 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.123801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.123835 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.123844 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.123857 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.123867 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.227021 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.227402 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.227597 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.227770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.227930 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.314947 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.314968 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.315037 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.315074 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.315156 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.315286 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.315472 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.315573 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.330978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.331017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.331029 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.331044 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.331055 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.432982 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.433258 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.433359 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.433477 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.433552 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.511514 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.511551 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.511562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.511578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.511588 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.524472 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.527657 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.527830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.527919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.528034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.528124 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.540475 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.543969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.544003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.544014 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.544028 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.544038 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.575395 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.575427 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.575439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.575454 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.575465 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.586358 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:21Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:21 crc kubenswrapper[4936]: E0930 13:40:21.586496 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.588028 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.588374 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.588485 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.588589 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.588816 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.691507 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.691537 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.691547 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.691562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.691572 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.793857 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.793910 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.793919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.793937 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.793948 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.895488 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.895854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.895928 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.896006 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.896070 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.998450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.998497 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.998511 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.998528 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:21 crc kubenswrapper[4936]: I0930 13:40:21.998540 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:21Z","lastTransitionTime":"2025-09-30T13:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.101119 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.101155 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.101164 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.101177 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.101187 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.203293 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.203329 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.203358 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.203374 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.203383 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.306308 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.306589 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.306676 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.306758 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.306820 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.315453 4936 scope.go:117] "RemoveContainer" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.409317 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.409375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.409389 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.409407 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.409425 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.512205 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.512244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.512253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.512267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.512277 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.614781 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.614808 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.614816 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.614829 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.614837 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.717200 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.717254 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.717266 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.717284 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.717296 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.770271 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/2.log" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.773093 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.773527 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.792434 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 
13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.803819 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.819611 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.819652 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.819661 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.819677 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.819686 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.825477 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.843527 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.853686 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc 
kubenswrapper[4936]: I0930 13:40:22.864761 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.883116 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.898024 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.908099 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921404 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921449 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921470 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921486 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:22Z","lastTransitionTime":"2025-09-30T13:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.921671 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.933675 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.943358 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.957324 4936 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.969424 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.981198 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:22 crc kubenswrapper[4936]: I0930 13:40:22.995397 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:22Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.008824 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.023388 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.023436 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.023448 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.023466 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.023478 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.125554 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.125592 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.125604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.125621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.125633 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.228053 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.228091 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.228102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.228117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.228127 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.315101 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.315159 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.315096 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:23 crc kubenswrapper[4936]: E0930 13:40:23.315221 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:23 crc kubenswrapper[4936]: E0930 13:40:23.315364 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:23 crc kubenswrapper[4936]: E0930 13:40:23.315441 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.315464 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:23 crc kubenswrapper[4936]: E0930 13:40:23.315542 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.331004 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.331206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.331373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.331543 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.331687 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.434908 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.434953 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.434963 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.434979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.434988 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.537527 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.537571 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.537583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.537603 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.537621 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.640199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.640253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.640268 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.640290 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.640314 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.742489 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.742536 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.742547 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.742569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.742582 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.778079 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/3.log" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.778789 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/2.log" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.781533 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" exitCode=1 Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.781582 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.781633 4936 scope.go:117] "RemoveContainer" containerID="a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.782617 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:40:23 crc kubenswrapper[4936]: E0930 13:40:23.782868 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.800550 4936 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.820604 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a98b0f5b2aa89fcf669c338202959c207f8fdaf20b2cfe963dfb599cf0f42aae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:39:53Z\\\",\\\"message\\\":\\\"thCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.250],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Con
ditions:[]Condition{},},}\\\\nI0930 13:39:53.763763 6481 obj_retry.go:409] Going to retry *v1.Pod resource setup for 14 objects: [openshift-multus/network-metrics-daemon-2v46m openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-5zj44 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-7vnws openshift-image-registry/node-ca-fx6ff openshift-multus/multus-additional-cni-plugins-jzqxn openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6 openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-vxjrh openshift-network-diagnostics/network-check-target-xd92c]\\\\nF0930 13:39:53.763776 6481 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:23Z\\\",\\\"message\\\":\\\"39 6852 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:40:23.060251 6852 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 13:40:23.060257 6852 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 13:40:23.060261 6852 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:40:23.060272 6852 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:40:23.060281 6852 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:40:23.060282 6852 handler.go:208] Removed 
*v1.EgressFirewall event handler 9\\\\nI0930 13:40:23.060289 6852 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:40:23.060295 6852 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 13:40:23.060437 6852 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:40:23.060484 6852 factory.go:656] Stopping watch factory\\\\nI0930 13:40:23.060509 6852 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:40:23.060510 6852 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:40:23.060532 6852 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/n
etworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.836027 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc 
kubenswrapper[4936]: I0930 13:40:23.844671 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.844694 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.844702 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.844713 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.844722 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.855237 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.871652 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.883077 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.895819 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.904742 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.917446 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.932865 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.947131 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.947411 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.947509 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.947645 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.947723 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:23Z","lastTransitionTime":"2025-09-30T13:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.951138 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.967039 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:23 crc kubenswrapper[4936]: I0930 13:40:23.989174 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:23Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.026708 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.049645 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.049701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.049713 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.049732 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.049745 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.058361 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.076565 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.094468 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.151871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.151916 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.151925 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc 
kubenswrapper[4936]: I0930 13:40:24.151941 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.151953 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.254316 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.254375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.254384 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.254399 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.254409 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.356546 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.356607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.356624 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.356650 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.356667 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.459765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.459815 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.459829 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.459848 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.459861 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.563451 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.563517 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.563539 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.563565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.563583 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.666885 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.666937 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.666955 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.666977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.666994 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.770493 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.770557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.770575 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.770599 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.770620 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.788859 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/3.log" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.795456 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:40:24 crc kubenswrapper[4936]: E0930 13:40:24.795655 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.813497 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b1
2f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.848021 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:23Z\\\",\\\"message\\\":\\\"39 6852 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:40:23.060251 6852 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 13:40:23.060257 6852 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 13:40:23.060261 6852 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0930 13:40:23.060272 6852 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:40:23.060281 6852 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:40:23.060282 6852 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:40:23.060289 6852 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:40:23.060295 6852 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 13:40:23.060437 6852 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:40:23.060484 6852 factory.go:656] Stopping watch factory\\\\nI0930 13:40:23.060509 6852 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:40:23.060510 6852 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:40:23.060532 6852 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.863609 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.874267 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.874326 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.874373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.874402 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.874420 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.883247 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.901605 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.916715 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.934897 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.950757 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.967188 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.977940 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.978033 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.978088 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.978117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.978169 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:24Z","lastTransitionTime":"2025-09-30T13:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.984434 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:24 crc kubenswrapper[4936]: I0930 13:40:24.998507 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.013178 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.028606 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.046803 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.065011 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.079240 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652b
d5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.080168 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.080210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.080228 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.080248 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.080259 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.094871 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:25Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.182450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.182498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.182516 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.182536 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.182550 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.285720 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.285770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.285783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.285801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.285814 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.314963 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.314968 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.315026 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.315094 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:25 crc kubenswrapper[4936]: E0930 13:40:25.315209 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:25 crc kubenswrapper[4936]: E0930 13:40:25.315327 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:25 crc kubenswrapper[4936]: E0930 13:40:25.315518 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:25 crc kubenswrapper[4936]: E0930 13:40:25.315585 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.388468 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.388513 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.388531 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.388552 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.388567 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.490858 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.490903 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.490913 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.490927 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.490938 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.594399 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.594476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.594500 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.594523 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.594538 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.698501 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.698534 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.698545 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.698569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.698581 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.801413 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.801470 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.801488 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.801515 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.801539 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.903830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.903875 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.903889 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.903904 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:25 crc kubenswrapper[4936]: I0930 13:40:25.903915 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:25Z","lastTransitionTime":"2025-09-30T13:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.006757 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.006842 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.006864 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.006895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.006917 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.110483 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.110533 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.110550 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.110570 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.110584 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.213934 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.213984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.214000 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.214020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.214034 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.317006 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.317082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.317100 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.317123 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.317142 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.421180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.421218 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.421231 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.421263 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.421276 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.524651 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.524722 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.524744 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.524834 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.524889 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.628080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.628151 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.628166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.628186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.628199 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.731398 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.731447 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.731463 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.732001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.732018 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.842452 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.842563 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.842591 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.842623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.842646 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.946111 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.946191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.946211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.946244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:26 crc kubenswrapper[4936]: I0930 13:40:26.946261 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:26Z","lastTransitionTime":"2025-09-30T13:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.049125 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.049192 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.049211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.049236 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.049254 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.151681 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.151730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.151741 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.151758 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.151770 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.254309 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.254375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.254387 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.254406 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.254416 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.314749 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.314812 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.314759 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:27 crc kubenswrapper[4936]: E0930 13:40:27.314909 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.314993 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:27 crc kubenswrapper[4936]: E0930 13:40:27.315156 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:27 crc kubenswrapper[4936]: E0930 13:40:27.315292 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:27 crc kubenswrapper[4936]: E0930 13:40:27.315588 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.357697 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.357748 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.357773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.357789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.357801 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.460556 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.460621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.460639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.460661 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.460680 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.563971 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.564014 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.564022 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.564035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.564045 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.666759 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.666839 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.666860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.666887 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.666905 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.770479 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.770570 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.770582 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.770613 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.770635 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.873295 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.873385 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.873404 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.873429 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.873445 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.976532 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.976612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.976633 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.976674 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:27 crc kubenswrapper[4936]: I0930 13:40:27.976695 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:27Z","lastTransitionTime":"2025-09-30T13:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.079549 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.079587 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.079598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.079612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.079623 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.182091 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.182175 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.182213 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.182234 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.182248 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.285575 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.285623 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.285636 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.285654 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.285666 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.387918 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.387955 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.387967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.387982 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.387994 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.490425 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.490464 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.490476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.490493 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.490534 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.592708 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.592754 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.592763 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.592777 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.592786 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.696132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.696187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.696199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.696216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.696228 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.799441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.799478 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.799486 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.799499 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.799508 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.903262 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.903316 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.903328 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.903380 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:28 crc kubenswrapper[4936]: I0930 13:40:28.903393 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:28Z","lastTransitionTime":"2025-09-30T13:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.005967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.006003 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.006014 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.006031 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.006069 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.108820 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.108874 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.108887 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.108907 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.108921 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.212620 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.212665 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.212683 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.212705 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.212721 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.314408 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.314411 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.314513 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.314534 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:29 crc kubenswrapper[4936]: E0930 13:40:29.314655 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:29 crc kubenswrapper[4936]: E0930 13:40:29.314800 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:29 crc kubenswrapper[4936]: E0930 13:40:29.314919 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:29 crc kubenswrapper[4936]: E0930 13:40:29.314979 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.315157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.315195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.315210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.315227 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.315242 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.417999 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.418055 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.418074 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.418098 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.418116 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.520117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.520659 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.520683 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.520712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.520735 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.623243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.623310 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.623322 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.623361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.623375 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.725809 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.725919 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.725940 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.725967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.725985 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.828917 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.829002 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.829015 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.829034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.829046 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.932080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.932120 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.932137 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.932159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:29 crc kubenswrapper[4936]: I0930 13:40:29.932175 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:29Z","lastTransitionTime":"2025-09-30T13:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.035718 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.035762 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.035774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.035789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.035798 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.139304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.139420 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.139444 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.139472 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.139487 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.241786 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.241826 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.241863 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.241894 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.241912 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.333447 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086d
a2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.343730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.343766 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.343777 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.343793 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.343805 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.348412 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.363857 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.382124 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.399845 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337
aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.416542 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.430460 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.442033 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.445873 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.445906 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.445917 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.445934 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.445945 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.462562 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:23Z\\\",\\\"message\\\":\\\"39 6852 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:40:23.060251 6852 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 13:40:23.060257 6852 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 13:40:23.060261 6852 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0930 13:40:23.060272 6852 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:40:23.060281 6852 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:40:23.060282 6852 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:40:23.060289 6852 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:40:23.060295 6852 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 13:40:23.060437 6852 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:40:23.060484 6852 factory.go:656] Stopping watch factory\\\\nI0930 13:40:23.060509 6852 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:40:23.060510 6852 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:40:23.060532 6852 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.473857 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.487962 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba229489
77de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.502074 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.515705 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.530647 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.543821 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.548601 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.548656 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.548670 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.548689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.548705 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.559989 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.575704 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:30Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.651439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.651492 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.651509 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.651533 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.651554 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.754261 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.754305 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.754317 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.754361 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.754374 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.857202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.857274 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.857288 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.857304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.857317 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.959667 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.959737 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.959751 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.959803 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:30 crc kubenswrapper[4936]: I0930 13:40:30.959819 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:30Z","lastTransitionTime":"2025-09-30T13:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.062634 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.062663 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.062671 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.062685 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.062696 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.164490 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.164539 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.164551 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.164570 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.164584 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.267680 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.267752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.267770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.267793 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.267808 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.315301 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.315514 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.315598 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.315591 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.315751 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.315737 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.315854 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.316110 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.371636 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.371710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.371736 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.371764 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.371782 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.474565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.474653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.474673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.474695 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.474711 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.577768 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.577799 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.577810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.577826 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.577838 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.682845 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.682890 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.682904 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.682924 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.682938 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.785456 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.785514 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.785526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.785541 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.785552 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.863269 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.863304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.863315 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.863330 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.863363 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.879575 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.884007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.884078 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.884103 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.884132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.884159 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.899279 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.904066 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.904102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.904114 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.904133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.904145 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.919944 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.924097 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.924260 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.924281 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.924299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.924311 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.939573 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.943288 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.943475 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.943542 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.943604 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.943671 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.956214 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:31 crc kubenswrapper[4936]: E0930 13:40:31.956533 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.958106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.958152 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.958169 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.958189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:31 crc kubenswrapper[4936]: I0930 13:40:31.958205 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:31Z","lastTransitionTime":"2025-09-30T13:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.061858 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.061954 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.061977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.062035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.062056 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.165077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.165206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.165234 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.165265 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.165308 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.268383 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.268441 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.268457 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.268476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.268494 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.370833 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.370871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.370882 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.370899 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.370910 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.473727 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.473790 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.473802 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.473846 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.473858 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.575928 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.576171 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.576290 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.576392 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.576457 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.678572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.678636 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.678653 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.678679 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.678696 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.780520 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.780550 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.780559 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.780571 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.780579 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.883132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.884286 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.884412 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.884500 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.884586 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.988319 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.988397 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.988410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.988426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:32 crc kubenswrapper[4936]: I0930 13:40:32.988924 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:32Z","lastTransitionTime":"2025-09-30T13:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.091116 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.091151 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.091187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.091199 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.091209 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.199176 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.199224 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.199241 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.199263 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.199281 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.302974 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.303039 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.303054 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.303077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.303091 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.314476 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.314660 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.314520 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:33 crc kubenswrapper[4936]: E0930 13:40:33.314936 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:33 crc kubenswrapper[4936]: E0930 13:40:33.315168 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.315402 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:33 crc kubenswrapper[4936]: E0930 13:40:33.315612 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:33 crc kubenswrapper[4936]: E0930 13:40:33.315780 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.405819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.405875 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.405885 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.405902 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.405913 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.509593 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.509621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.509630 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.509643 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.509651 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.613607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.613892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.614026 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.614163 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.614308 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.717413 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.717744 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.717811 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.717892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.717964 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.820858 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.820920 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.820942 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.820969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.820991 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.923445 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.924040 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.924126 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.924211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:33 crc kubenswrapper[4936]: I0930 13:40:33.924288 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:33Z","lastTransitionTime":"2025-09-30T13:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.026977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.027073 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.027093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.027271 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.027302 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.130314 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.130397 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.130407 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.130423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.130433 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.233138 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.233177 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.233273 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.233290 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.233313 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.335767 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.335816 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.335832 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.335855 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.335870 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.438526 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.438557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.438565 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.438578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.438586 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.540470 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.540503 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.540512 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.540525 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.540533 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.642379 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.642448 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.642462 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.642480 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.642492 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.745043 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.745400 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.745413 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.745428 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.745438 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.847779 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.847826 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.847841 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.847860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.847872 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.946771 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.946826 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.946859 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.946876 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.946931 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.946954 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.946968 4936 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.946977 4936 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.946996 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947017 4936 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947028 4936 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947018 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-09-30 13:41:38.947001404 +0000 UTC m=+149.331003705 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947059 4936 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947069 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:38.947059056 +0000 UTC m=+149.331061357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947162 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:38.947150489 +0000 UTC m=+149.331152790 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 13:40:34 crc kubenswrapper[4936]: E0930 13:40:34.947198 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:38.9471924 +0000 UTC m=+149.331194701 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.949929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.949956 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.949967 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.949981 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:34 crc kubenswrapper[4936]: I0930 13:40:34.949992 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:34Z","lastTransitionTime":"2025-09-30T13:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.047712 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:40:35 crc kubenswrapper[4936]: E0930 13:40:35.047890 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:39.047874595 +0000 UTC m=+149.431876896 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.052835 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.053078 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.053155 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.053282 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.053450 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.155915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.155959 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.155972 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.155988 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.156321 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.258460 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.258496 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.258506 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.258598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.258619 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.314666 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.314708 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.314666 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:40:35 crc kubenswrapper[4936]: E0930 13:40:35.314793 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.314949 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:40:35 crc kubenswrapper[4936]: E0930 13:40:35.314948 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7"
Sep 30 13:40:35 crc kubenswrapper[4936]: E0930 13:40:35.315000 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:40:35 crc kubenswrapper[4936]: E0930 13:40:35.315044 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.360149 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.360179 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.360191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.360207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.360219 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.462406 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.462439 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.462450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.462465 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.462473 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.564828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.564892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.564916 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.564944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.564966 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.667193 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.667243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.667258 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.667279 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.667294 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.768761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.768791 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.768800 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.768813 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.768821 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.871034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.871061 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.871073 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.871110 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.871121 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.973650 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.973692 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.973703 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.973720 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:35 crc kubenswrapper[4936]: I0930 13:40:35.973732 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:35Z","lastTransitionTime":"2025-09-30T13:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.075943 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.075995 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.076004 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.076016 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.076024 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.178327 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.178405 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.178416 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.178436 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.178451 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.281048 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.281083 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.281094 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.281109 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.281119 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.383530 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.383581 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.383592 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.383644 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.383655 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.485362 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.485411 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.485423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.485440 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.485453 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.588435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.588486 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.588502 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.588521 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.588533 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.691020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.691057 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.691068 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.691082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.691092 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.792806 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.793078 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.793150 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.793264 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.793353 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.895174 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.895214 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.895228 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.895264 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.895276 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.998102 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.998466 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.998493 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.998518 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:36 crc kubenswrapper[4936]: I0930 13:40:36.998532 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:36Z","lastTransitionTime":"2025-09-30T13:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.100544 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.100589 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.100613 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.100628 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.100636 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.202273 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.202320 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.202328 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.202362 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.202370 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.305086 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.305122 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.305130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.305144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.305153 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.314774 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 13:40:37 crc kubenswrapper[4936]: E0930 13:40:37.314984 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.315527 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.315558 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.315666 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 13:40:37 crc kubenswrapper[4936]: E0930 13:40:37.315768 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 13:40:37 crc kubenswrapper[4936]: E0930 13:40:37.315915 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7"
Sep 30 13:40:37 crc kubenswrapper[4936]: E0930 13:40:37.316017 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.329817 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.407558 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.408012 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.408305 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.408535 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.408860 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.511397 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.511664 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.511752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.511829 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.511920 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.614843 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.614915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.614933 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.614959 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.614976 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.717387 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.717450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.717467 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.717502 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.717521 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.820243 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.820289 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.820302 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.820320 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.820354 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.923017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.923059 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.923070 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.923087 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:37 crc kubenswrapper[4936]: I0930 13:40:37.923100 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:37Z","lastTransitionTime":"2025-09-30T13:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.025555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.025631 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.025650 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.025673 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.025690 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.129035 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.129093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.129109 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.129130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.129147 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.231916 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.231978 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.231997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.232025 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.232043 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.334021 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.335030 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.335182 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.335282 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.335444 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.335574 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.438289 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.438382 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.438406 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.438435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.438459 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.541468 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.541506 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.541514 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.541529 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.541539 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.644470 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.644512 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.644523 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.644539 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.644550 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.747760 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.747834 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.747860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.747887 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.747915 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.850038 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.850110 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.850134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.850162 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.850184 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.952358 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.952386 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.952393 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.952405 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:38 crc kubenswrapper[4936]: I0930 13:40:38.952413 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:38Z","lastTransitionTime":"2025-09-30T13:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.054318 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.054846 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.054920 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.054992 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.055057 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.157356 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.157394 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.157404 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.157418 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.157466 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.259738 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.259773 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.259783 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.259797 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.259806 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.314814 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.314848 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.314866 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.314901 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:39 crc kubenswrapper[4936]: E0930 13:40:39.315397 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:39 crc kubenswrapper[4936]: E0930 13:40:39.315555 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:39 crc kubenswrapper[4936]: E0930 13:40:39.315684 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:39 crc kubenswrapper[4936]: E0930 13:40:39.315702 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.315861 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:40:39 crc kubenswrapper[4936]: E0930 13:40:39.316076 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.361639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.361687 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.361709 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.361725 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.361737 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.463242 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.463281 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.463290 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.463302 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.463312 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.566097 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.566174 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.566196 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.566223 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.566245 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.668756 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.668794 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.668804 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.668818 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.668828 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.771207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.771244 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.771253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.771285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.771297 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.872969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.873017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.873028 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.873041 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.873051 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.976028 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.976072 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.976082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.976096 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:39 crc kubenswrapper[4936]: I0930 13:40:39.976106 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:39Z","lastTransitionTime":"2025-09-30T13:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.078729 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.078781 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.078796 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.078817 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.078832 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.180985 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.181057 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.181065 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.181077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.181086 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.283312 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.283360 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.283370 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.283381 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.283390 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.328071 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fffa25-ec6d-4d08-a071-0bbcc613227e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfca0b1bd5481bea13af586db630f1141ff976d3da01212a59f178d8293bc1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36f44aca3adbafbea2bb815782511bea9e78b4524a31d0b32749d66eda666c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f44aca3adbafbea2bb815782511bea9e78b4524a31d0b32749d66eda666c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.339308 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.350840 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550e70d18205716e07fb66558de1abc4cd92973d4394211e534e210e2a04aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://197975715d598750921c8137ab39d0c96ed43794c86acf65ad9561abf3470d80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.361920 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.370764 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.385241 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.385279 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.385289 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.385304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.385317 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.388649 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45687651-08a1-4fe9-9f81-bc00715f14ad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f9a4b004a24ae6839d938f2bd246598c885fc558bfba73f45e9788aedd6a348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ff5042a7d61e0e4577b29599f839c2daf706edebfb291251517082ad13413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8e08c20275c5eb8fb4bd360ab1115014050fe2bd9b8dfaac407f8ab9aa115c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34790420f2fbc84f8e127a0d53172ba0b739fe72c02abc09b18cdb3780469ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b646312f80abd4a3f21dcd0ae1484aad140291ffc1c18f7e22a8e5ac0e013a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd89827b85338f2e1555b12d4fd87323ab5dd62b690f2fb36753e521131cb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd89827b85338f2e1555b12d4fd87323ab5dd62b690f2fb36753e521131cb42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8a7197cef2ee198fbb5ad5cea725bf0171b8ff2ea8e2f7d3a398b7bf630986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8a7197cef2ee198fbb5ad5cea725bf0171b8ff2ea8e2f7d3a398b7bf630986b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fee4d40a61fe8d2c1c53e3c99cb0fe8902aecf0c55a643aed7b9cde2b4678948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fee4d40a61fe8d2c1c53e3c99cb0fe8902aecf0c55a643aed7b9cde2b4678948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.401230 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7d7b2a1-8f90-4c9f-b5cd-7e6b48aa6175\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0af36b4053354fe698a73ea42d961f02645dd1641b0cbdb3baaa18322c4fc2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633cab7086da2df3b5029db07521445affa1a8a4dc24561ebcb2fe4824e6622e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e32083bfe28cd2e62d0511f11e5dfe0246a05fbca13d5da192262de2378c05c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d56238bc932c179f84a5848539abf22aeb6818d6fc2f06cc04c024f10aa2150\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.411707 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad749945dbee8cf4e7c718747e08516379018f716ee235e0582d23879d638cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.421113 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"437ce259-4db0-4fa7-add8-5f747c7e7fbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c4ee9d83ec4799fb66b7b47123c77b9dae4dbbd5f06bfda032567297e0939c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://a08e2b652bd5ea66b08b8186e9e4c204d9bb24e98d561f8410614dcfeaebaac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747d0723207eb1c09444a9cd9f8f52b45b0851c975dfbc81413aafaaa4469fca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d7855f1d5724f1bf147b46c884d2eaa8361d283e73f03084b5430644f0b0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.430214 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.437894 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zj44" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa0af778-18be-4c3d-aa1a-0e15485c2aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbc049b3d44764ae6077c2ec8de2fcc5de7c4bce17c7b3150aa11975c13001b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fnlqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zj44\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.449662 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vxjrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dbb1e3f-927e-4587-835e-b21370b33262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:18Z\\\",\\\"message\\\":\\\"2025-09-30T13:39:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009\\\\n2025-09-30T13:39:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0b305a56-6fcc-48c8-ba77-2b7cd14c1009 to /host/opt/cni/bin/\\\\n2025-09-30T13:39:33Z [verbose] multus-daemon started\\\\n2025-09-30T13:39:33Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:40:18Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwk7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vxjrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.462608 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd1cdd6-e1f0-4750-849e-8e22bbc7fedf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63c4e694f67a9f9634e94f0c8e7b143200e86d274ff50c65ce3dc443c0a11ec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16adc2c27cc1fddcd6823ea77e1de06802a4a39d0e55a06895f29dd6b85f8e70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b36833a8327f0e0a7707dd28c5633275fed2549372a35085994d2b46584aacd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dc359947e899e21364c2ddcfc771dde5163c88e86d54f85071ba4c46b6a265d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d092
d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d092d2a2099a8f6c159c9980918128bf5a8b5db902dcd10ce21b74cae430bbe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45dac6b56d07bb317eee8c7d8fd00a8b0d2a40eddea46cfec78cfdcb2d1d273b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c2da0a52f2feab9a7346428ea1c88e3c0dba4e57f9b9eac357c387d284d00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fcsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jzqxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.472390 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f97225f9-be97-4bc9-841b-fc96e4a8be4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c637b9dda2a085daa132f4c1e90448e0c4754cd6e62ce335c271f23c993f77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4afd1351049d54e6fc8464c7c3b32bf91337aadd1a1a8832eb9b8d72ff9da06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxcgz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xwqb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.484735 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.487605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.487640 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.487656 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.487671 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.487680 4936 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.496715 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.507845 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.524032 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:23Z\\\",\\\"message\\\":\\\"39 6852 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:40:23.060251 6852 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 13:40:23.060257 6852 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 13:40:23.060261 6852 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0930 13:40:23.060272 6852 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:40:23.060281 6852 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:40:23.060282 6852 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:40:23.060289 6852 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:40:23.060295 6852 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 13:40:23.060437 6852 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:40:23.060484 6852 factory.go:656] Stopping watch factory\\\\nI0930 13:40:23.060509 6852 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:40:23.060510 6852 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:40:23.060532 6852 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.533577 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:40Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.590658 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.590701 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.590710 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.590725 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.590734 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.695031 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.695183 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.695210 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.695299 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.695888 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.798044 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.798080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.798088 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.798100 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.798108 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.900119 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.900159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.900170 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.900184 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:40 crc kubenswrapper[4936]: I0930 13:40:40.900194 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:40Z","lastTransitionTime":"2025-09-30T13:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.002513 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.002557 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.002568 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.002584 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.002595 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.105067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.105106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.105117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.105135 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.105147 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.207534 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.207581 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.207594 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.207612 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.207624 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.310186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.310223 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.310231 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.310247 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.310257 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.314518 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.314539 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:41 crc kubenswrapper[4936]: E0930 13:40:41.314610 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.314638 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.314672 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:41 crc kubenswrapper[4936]: E0930 13:40:41.314777 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:41 crc kubenswrapper[4936]: E0930 13:40:41.314804 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:41 crc kubenswrapper[4936]: E0930 13:40:41.314870 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.412583 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.412621 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.412632 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.412647 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.412658 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.515008 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.515072 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.515084 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.515099 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.515111 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.617242 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.617277 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.617285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.617298 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.617307 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.718947 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.718971 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.718979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.718991 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.719000 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.821080 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.821121 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.821130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.821144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.821154 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.923156 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.923194 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.923202 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.923217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:41 crc kubenswrapper[4936]: I0930 13:40:41.923225 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:41Z","lastTransitionTime":"2025-09-30T13:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.014148 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.014182 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.014193 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.014206 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.014214 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.032082 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.036101 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.036126 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.036134 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.036146 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.036157 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.053297 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.056879 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.056906 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.056914 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.056929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.056939 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.067510 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.071807 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.071863 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.071881 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.071905 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.071924 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.084731 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.088074 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.088106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.088117 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.088133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.088145 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.099903 4936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8be31134-e63d-454e-b952-15f6f996f2b7\\\",\\\"systemUUID\\\":\\\"db7ffece-d862-468a-996c-f544c38024fc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:42 crc kubenswrapper[4936]: E0930 13:40:42.100078 4936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.101373 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.101429 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.101444 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.101461 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.101475 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.203314 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.203371 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.203410 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.203423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.203432 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.305562 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.305592 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.305605 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.305618 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.305628 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.408304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.408424 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.408455 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.408484 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.408509 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.510729 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.510801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.510819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.510845 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.510865 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.613232 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.613275 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.613287 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.613304 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.613315 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.716109 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.716144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.716153 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.716167 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.716176 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.819086 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.819132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.819144 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.819163 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.819179 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.922082 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.922123 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.922133 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.922147 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:42 crc kubenswrapper[4936]: I0930 13:40:42.922157 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:42Z","lastTransitionTime":"2025-09-30T13:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.033876 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.034125 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.034187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.034296 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.034383 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.137810 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.137874 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.137892 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.137918 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.137936 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.240524 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.240572 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.240590 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.240639 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.240656 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.314628 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:43 crc kubenswrapper[4936]: E0930 13:40:43.314763 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.315077 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.315196 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.315200 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:43 crc kubenswrapper[4936]: E0930 13:40:43.315319 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:43 crc kubenswrapper[4936]: E0930 13:40:43.315524 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:43 crc kubenswrapper[4936]: E0930 13:40:43.315627 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.343772 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.343840 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.343856 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.343880 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.343902 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.446482 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.446513 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.446548 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.446566 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.446576 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.549811 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.549897 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.549922 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.549939 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.549953 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.653228 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.653264 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.653276 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.653316 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.653329 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.755746 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.756099 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.756191 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.756286 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.756406 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.858254 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.858282 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.858290 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.858302 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.858311 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.961152 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.961187 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.961197 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.961211 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:43 crc kubenswrapper[4936]: I0930 13:40:43.961222 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:43Z","lastTransitionTime":"2025-09-30T13:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.063563 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.063607 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.063620 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.063638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.063650 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.166847 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.166921 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.166938 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.166961 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.166979 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.269894 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.269948 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.269964 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.269987 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.269999 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.372018 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.372069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.372085 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.372106 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.372125 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.475143 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.475180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.475189 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.475204 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.475216 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.577159 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.577197 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.577207 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.577222 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.577230 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.681679 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.681712 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.681723 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.681737 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.681746 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.784755 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.784891 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.784915 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.784955 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.784986 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.889122 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.889180 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.889195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.889216 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.889238 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.991726 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.991761 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.991770 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.991784 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:44 crc kubenswrapper[4936]: I0930 13:40:44.991792 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:44Z","lastTransitionTime":"2025-09-30T13:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.095220 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.095315 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.095381 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.095415 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.095434 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.198598 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.198654 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.198667 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.198683 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.198696 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.301578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.301637 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.301650 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.301670 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.301681 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.315181 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.315247 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.315252 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.315194 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:45 crc kubenswrapper[4936]: E0930 13:40:45.315539 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:45 crc kubenswrapper[4936]: E0930 13:40:45.315997 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:45 crc kubenswrapper[4936]: E0930 13:40:45.316113 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:45 crc kubenswrapper[4936]: E0930 13:40:45.316198 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.403800 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.403841 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.403852 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.403866 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.403877 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.506165 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.506208 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.506219 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.506233 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.506243 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.609212 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.609448 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.609473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.609495 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.609514 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.712730 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.712792 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.712807 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.712828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.712843 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.817070 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.817118 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.817130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.817147 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.817160 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.919884 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.919944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.919960 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.919986 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:45 crc kubenswrapper[4936]: I0930 13:40:45.920004 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:45Z","lastTransitionTime":"2025-09-30T13:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.021984 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.022067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.022077 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.022094 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.022106 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.125295 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.125419 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.125445 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.125476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.125493 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.227961 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.227998 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.228007 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.228020 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.228056 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.330103 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.330139 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.330151 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.330170 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.330180 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.433196 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.433232 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.433242 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.433256 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.433266 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.535375 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.535440 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.535467 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.535488 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.535503 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.637830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.637861 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.637871 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.637886 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.637897 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.739428 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.739715 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.739826 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.739914 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.739998 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.842435 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.842477 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.842486 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.842498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.842515 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.944450 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.944698 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.944971 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.945179 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:46 crc kubenswrapper[4936]: I0930 13:40:46.945267 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:46Z","lastTransitionTime":"2025-09-30T13:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.047689 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.047932 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.048001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.048069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.048127 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.150949 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.151001 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.151015 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.151034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.151055 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.254148 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.254186 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.254195 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.254209 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.254219 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.315222 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.315293 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.315235 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.315309 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:47 crc kubenswrapper[4936]: E0930 13:40:47.315415 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:47 crc kubenswrapper[4936]: E0930 13:40:47.315566 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:47 crc kubenswrapper[4936]: E0930 13:40:47.315837 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:47 crc kubenswrapper[4936]: E0930 13:40:47.315887 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.356545 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.356595 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.356606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.356638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.356651 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.460196 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.460251 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.460270 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.460292 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.460311 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.562935 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.563157 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.563250 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.563360 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.563447 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.665906 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.666135 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.666259 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.666498 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.666703 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.768654 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.768944 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.769034 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.769124 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.769202 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.871711 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.872027 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.872132 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.872249 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.872349 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.974453 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.974669 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.974797 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.974860 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:47 crc kubenswrapper[4936]: I0930 13:40:47.974916 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:47Z","lastTransitionTime":"2025-09-30T13:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.076801 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.076834 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.076845 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.076859 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.076868 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.082178 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:48 crc kubenswrapper[4936]: E0930 13:40:48.082298 4936 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:40:48 crc kubenswrapper[4936]: E0930 13:40:48.082373 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs podName:e3bd8048-3efa-41ed-a7ff-8d477db72be7 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:52.082352229 +0000 UTC m=+162.466354550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs") pod "network-metrics-daemon-2v46m" (UID: "e3bd8048-3efa-41ed-a7ff-8d477db72be7") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.179749 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.179780 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.179789 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.179803 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.179812 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.282225 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.282496 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.282560 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.282631 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.282692 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.385923 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.385969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.385983 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.385997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.386007 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.488017 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.488060 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.488071 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.488090 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.488102 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.591089 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.591130 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.591143 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.591217 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.591230 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.693928 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.694275 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.694395 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.694505 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.694628 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.797423 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.797742 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.797895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.798041 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.798165 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.901237 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.901275 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.901285 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.901305 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:48 crc kubenswrapper[4936]: I0930 13:40:48.901317 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:48Z","lastTransitionTime":"2025-09-30T13:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.003470 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.003798 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.003875 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.003957 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.004052 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.106005 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.106275 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.106547 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.106652 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.106729 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.208977 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.209294 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.209424 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.209520 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.209603 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.311794 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.312029 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.312241 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.312432 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.312593 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.315101 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.315227 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.315263 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:49 crc kubenswrapper[4936]: E0930 13:40:49.315774 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:49 crc kubenswrapper[4936]: E0930 13:40:49.316422 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.316443 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:49 crc kubenswrapper[4936]: E0930 13:40:49.316868 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:49 crc kubenswrapper[4936]: E0930 13:40:49.316512 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.414819 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.414853 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.414861 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.414874 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.414884 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.518026 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.518067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.518076 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.518090 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.518101 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.619997 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.620041 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.620053 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.620069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.620080 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.722646 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.722720 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.722742 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.722774 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.722795 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.825221 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.825286 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.825305 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.825363 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.825382 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.927454 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.927922 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.928154 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.928426 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:49 crc kubenswrapper[4936]: I0930 13:40:49.928667 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:49Z","lastTransitionTime":"2025-09-30T13:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.030886 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.030925 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.030938 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.030993 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.031005 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.133366 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.133402 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.133413 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.133436 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.133447 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.235569 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.235619 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.235671 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.235705 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.235718 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.315219 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:40:50 crc kubenswrapper[4936]: E0930 13:40:50.315368 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.328243 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v46m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxrg7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v46m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc 
kubenswrapper[4936]: I0930 13:40:50.337377 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.337453 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.337473 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.337492 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.337507 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.341303 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343bae65-9a21-439f-8504-8282fa99521a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36efc866bfcf654adad4f3aeede87a62797b56218312270c77ff1fa2829e7829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d418e49a69f8b1b8d79a9cc97646c14da3a657c4ec56c34d8e4eed2e1ddd54f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c0273aec15c711dbfb26f0c775189c81d13bc1333a331c358587200e3de1c31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6625580c07bb67ac50a6ffa9e19fff88bb9602760eda24b0affd6579717c7ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32fb8806b2e33912727e98aa439e5b78b58f4b0b49db1d8f21f9d2718e8a89ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:39:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 13:39:30.042966 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 13:39:30.043119 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:39:30.044043 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1119305182/tls.crt::/tmp/serving-cert-1119305182/tls.key\\\\\\\"\\\\nI0930 13:39:30.399724 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:39:30.403038 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:39:30.403062 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:39:30.403110 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:39:30.403117 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:39:30.407874 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:39:30.408257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408352 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:39:30.408406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0930 13:39:30.408418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0930 13:39:30.408435 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:39:30.408463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:39:30.408470 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 13:39:30.410200 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fef95497e86a6c4528ff9d18f06e7a810c5e29788d8ebe5fbbadcecfa100bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee01375887cdeead8c7ecf3380c08af3cfe6a1ba22948977de2ad709ba1fc5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.353451 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe3e8e86abb77549da694eda0ba0a1c7cacdbc47ca92074598de7b7868ebe35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.368418 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e09d215c-5c94-4b2a-bc68-c51a84b784a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d99e1b89c6acf44b8ee10fdd91ebbff421090ac9cec32801704bb89cf049b193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426nz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wj4sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.397023 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"166715eb-a672-4111-b64e-626a0f7b0d74\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:40:23Z\\\",\\\"message\\\":\\\"39 6852 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 13:40:23.060251 6852 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 13:40:23.060257 6852 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 13:40:23.060261 6852 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0930 13:40:23.060272 6852 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 13:40:23.060281 6852 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 13:40:23.060282 6852 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 13:40:23.060289 6852 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:40:23.060295 6852 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 13:40:23.060437 6852 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 13:40:23.060484 6852 factory.go:656] Stopping watch factory\\\\nI0930 13:40:23.060509 6852 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 13:40:23.060510 6852 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:40:23.060532 6852 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:40:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://859a3f3ff49a604e88
06b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:39:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:39:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h77cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7vnws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.409158 4936 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fx6ff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be826921-6363-4c9a-9167-3af8e59e042d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://142d821643a05756cda30e6c0695cd53dae3c55db04d376d2a3ac15b00165727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wzfh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:39:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fx6ff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:40:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.441765 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.441804 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.441814 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.441827 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.441837 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.445004 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.444982376 podStartE2EDuration="13.444982376s" podCreationTimestamp="2025-09-30 13:40:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.429453318 +0000 UTC m=+100.813455619" watchObservedRunningTime="2025-09-30 13:40:50.444982376 +0000 UTC m=+100.828984677" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.523510 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.523495774 podStartE2EDuration="12.523495774s" podCreationTimestamp="2025-09-30 13:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.509879516 +0000 UTC m=+100.893881817" watchObservedRunningTime="2025-09-30 13:40:50.523495774 +0000 UTC m=+100.907498075" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.542047 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.542026796 podStartE2EDuration="1m16.542026796s" podCreationTimestamp="2025-09-30 13:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.524237997 +0000 UTC m=+100.908240298" watchObservedRunningTime="2025-09-30 13:40:50.542026796 +0000 UTC m=+100.926029097" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.544067 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 
13:40:50.544142 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.544152 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.544166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.544175 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.560990 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jzqxn" podStartSLOduration=80.560970962 podStartE2EDuration="1m20.560970962s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.560646772 +0000 UTC m=+100.944649073" watchObservedRunningTime="2025-09-30 13:40:50.560970962 +0000 UTC m=+100.944973263" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.574182 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xwqb6" podStartSLOduration=80.574164967 podStartE2EDuration="1m20.574164967s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 13:40:50.57203301 +0000 UTC m=+100.956035311" watchObservedRunningTime="2025-09-30 13:40:50.574164967 +0000 UTC m=+100.958167268" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.593666 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.593651289 podStartE2EDuration="50.593651289s" podCreationTimestamp="2025-09-30 13:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.592982558 +0000 UTC m=+100.976984879" watchObservedRunningTime="2025-09-30 13:40:50.593651289 +0000 UTC m=+100.977653590" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.628392 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vxjrh" podStartSLOduration=80.628370041 podStartE2EDuration="1m20.628370041s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.627748421 +0000 UTC m=+101.011750722" watchObservedRunningTime="2025-09-30 13:40:50.628370041 +0000 UTC m=+101.012372362" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.628846 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5zj44" podStartSLOduration=81.628841215 podStartE2EDuration="1m21.628841215s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:50.615514367 +0000 UTC m=+100.999516668" watchObservedRunningTime="2025-09-30 13:40:50.628841215 +0000 UTC m=+101.012843516" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.646705 4936 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.646752 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.646763 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.646781 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.646792 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.749065 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.749253 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.749269 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.749283 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.749292 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.853776 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.853828 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.853841 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.853859 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.853873 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.956022 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.956069 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.956079 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.956093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:50 crc kubenswrapper[4936]: I0930 13:40:50.956104 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:50Z","lastTransitionTime":"2025-09-30T13:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.058555 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.058586 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.058594 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.058606 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.058616 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.182849 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.182932 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.182957 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.182992 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.183031 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.285830 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.285873 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.285884 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.285900 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.285909 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.314535 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.314550 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.314550 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.314639 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:51 crc kubenswrapper[4936]: E0930 13:40:51.314782 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:51 crc kubenswrapper[4936]: E0930 13:40:51.314942 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:51 crc kubenswrapper[4936]: E0930 13:40:51.315052 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:51 crc kubenswrapper[4936]: E0930 13:40:51.315281 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.388103 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.388137 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.388149 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.388166 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.388181 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.491058 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.491093 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.491105 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.491119 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.491128 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.593578 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.593614 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.593624 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.593638 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.593648 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.696127 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.696165 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.696176 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.696218 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.696229 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.799436 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.799476 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.799486 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.799499 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.799509 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.902368 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.902408 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.902419 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.902433 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:51 crc kubenswrapper[4936]: I0930 13:40:51.902443 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:51Z","lastTransitionTime":"2025-09-30T13:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.005039 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.005101 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.005112 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.005126 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.005136 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:52Z","lastTransitionTime":"2025-09-30T13:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.113887 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.113920 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.113929 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.113943 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.113953 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:52Z","lastTransitionTime":"2025-09-30T13:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.215895 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.215969 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.215979 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.215995 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.216006 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:52Z","lastTransitionTime":"2025-09-30T13:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.258802 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.258854 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.258863 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.258880 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.258889 4936 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:40:52Z","lastTransitionTime":"2025-09-30T13:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.303116 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc"] Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.303508 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.305160 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.305509 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.306045 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.306210 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.329494 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fx6ff" podStartSLOduration=83.329472552 podStartE2EDuration="1m23.329472552s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:52.31444607 +0000 UTC m=+102.698448401" watchObservedRunningTime="2025-09-30 13:40:52.329472552 +0000 UTC m=+102.713474853" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.342081 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.342058398 podStartE2EDuration="1m22.342058398s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:52.330755883 +0000 UTC m=+102.714758194" watchObservedRunningTime="2025-09-30 13:40:52.342058398 +0000 UTC m=+102.726060699" 
Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.353533 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podStartSLOduration=82.353520028 podStartE2EDuration="1m22.353520028s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:52.352881198 +0000 UTC m=+102.736883499" watchObservedRunningTime="2025-09-30 13:40:52.353520028 +0000 UTC m=+102.737522329" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.425583 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.425657 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0796df4b-e146-4524-bd7d-20977bc0301d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.425845 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 
13:40:52.425888 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0796df4b-e146-4524-bd7d-20977bc0301d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.425931 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0796df4b-e146-4524-bd7d-20977bc0301d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527234 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527282 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0796df4b-e146-4524-bd7d-20977bc0301d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527371 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0796df4b-e146-4524-bd7d-20977bc0301d-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527451 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527369 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527535 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0796df4b-e146-4524-bd7d-20977bc0301d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.527481 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0796df4b-e146-4524-bd7d-20977bc0301d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.528264 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0796df4b-e146-4524-bd7d-20977bc0301d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.539997 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0796df4b-e146-4524-bd7d-20977bc0301d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.552111 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0796df4b-e146-4524-bd7d-20977bc0301d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kjdkc\" (UID: \"0796df4b-e146-4524-bd7d-20977bc0301d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.618967 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" Sep 30 13:40:52 crc kubenswrapper[4936]: W0930 13:40:52.634496 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0796df4b_e146_4524_bd7d_20977bc0301d.slice/crio-7b60ae0c2a11ca3b2063d03e9b74bed066285aef4ef923771329f7f9c13636d8 WatchSource:0}: Error finding container 7b60ae0c2a11ca3b2063d03e9b74bed066285aef4ef923771329f7f9c13636d8: Status 404 returned error can't find the container with id 7b60ae0c2a11ca3b2063d03e9b74bed066285aef4ef923771329f7f9c13636d8 Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.879584 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" event={"ID":"0796df4b-e146-4524-bd7d-20977bc0301d","Type":"ContainerStarted","Data":"a2d35bbde64cf0d6a1bb1d373913dea388de3e3cf3a8c9a65f97e7cb3a1a29b7"} Sep 30 13:40:52 crc kubenswrapper[4936]: I0930 13:40:52.879655 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" event={"ID":"0796df4b-e146-4524-bd7d-20977bc0301d","Type":"ContainerStarted","Data":"7b60ae0c2a11ca3b2063d03e9b74bed066285aef4ef923771329f7f9c13636d8"} Sep 30 13:40:53 crc kubenswrapper[4936]: I0930 13:40:53.314586 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:53 crc kubenswrapper[4936]: I0930 13:40:53.315114 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:53 crc kubenswrapper[4936]: E0930 13:40:53.315720 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:53 crc kubenswrapper[4936]: I0930 13:40:53.315781 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:53 crc kubenswrapper[4936]: E0930 13:40:53.315878 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:53 crc kubenswrapper[4936]: E0930 13:40:53.316452 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:53 crc kubenswrapper[4936]: I0930 13:40:53.315504 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:53 crc kubenswrapper[4936]: E0930 13:40:53.317018 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:55 crc kubenswrapper[4936]: I0930 13:40:55.314358 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:55 crc kubenswrapper[4936]: I0930 13:40:55.314387 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:55 crc kubenswrapper[4936]: E0930 13:40:55.314799 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:55 crc kubenswrapper[4936]: I0930 13:40:55.314458 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:55 crc kubenswrapper[4936]: I0930 13:40:55.314428 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:55 crc kubenswrapper[4936]: E0930 13:40:55.314943 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:55 crc kubenswrapper[4936]: E0930 13:40:55.315017 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:55 crc kubenswrapper[4936]: E0930 13:40:55.315085 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:57 crc kubenswrapper[4936]: I0930 13:40:57.315157 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:57 crc kubenswrapper[4936]: I0930 13:40:57.315190 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:57 crc kubenswrapper[4936]: E0930 13:40:57.316083 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:57 crc kubenswrapper[4936]: I0930 13:40:57.315286 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:57 crc kubenswrapper[4936]: E0930 13:40:57.315930 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:57 crc kubenswrapper[4936]: E0930 13:40:57.316236 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:57 crc kubenswrapper[4936]: I0930 13:40:57.315297 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:57 crc kubenswrapper[4936]: E0930 13:40:57.316318 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:40:59 crc kubenswrapper[4936]: I0930 13:40:59.314986 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:40:59 crc kubenswrapper[4936]: E0930 13:40:59.315880 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:40:59 crc kubenswrapper[4936]: I0930 13:40:59.315025 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:40:59 crc kubenswrapper[4936]: E0930 13:40:59.316163 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:40:59 crc kubenswrapper[4936]: I0930 13:40:59.315024 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:40:59 crc kubenswrapper[4936]: E0930 13:40:59.316428 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:40:59 crc kubenswrapper[4936]: I0930 13:40:59.315061 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:40:59 crc kubenswrapper[4936]: E0930 13:40:59.316683 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:01 crc kubenswrapper[4936]: I0930 13:41:01.314270 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:01 crc kubenswrapper[4936]: E0930 13:41:01.314470 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:01 crc kubenswrapper[4936]: I0930 13:41:01.314501 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:01 crc kubenswrapper[4936]: I0930 13:41:01.314591 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:01 crc kubenswrapper[4936]: I0930 13:41:01.314912 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:01 crc kubenswrapper[4936]: E0930 13:41:01.314903 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:01 crc kubenswrapper[4936]: E0930 13:41:01.315469 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:01 crc kubenswrapper[4936]: I0930 13:41:01.315742 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:41:01 crc kubenswrapper[4936]: E0930 13:41:01.315936 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7vnws_openshift-ovn-kubernetes(166715eb-a672-4111-b64e-626a0f7b0d74)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" Sep 30 13:41:01 crc kubenswrapper[4936]: E0930 13:41:01.316137 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:03 crc kubenswrapper[4936]: I0930 13:41:03.314866 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:03 crc kubenswrapper[4936]: I0930 13:41:03.314949 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:03 crc kubenswrapper[4936]: E0930 13:41:03.315726 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:03 crc kubenswrapper[4936]: I0930 13:41:03.315075 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:03 crc kubenswrapper[4936]: E0930 13:41:03.315804 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:03 crc kubenswrapper[4936]: I0930 13:41:03.315011 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:03 crc kubenswrapper[4936]: E0930 13:41:03.316089 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:03 crc kubenswrapper[4936]: E0930 13:41:03.315997 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.916744 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/1.log" Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.917841 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/0.log" Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.918252 4936 generic.go:334] "Generic (PLEG): container finished" podID="9dbb1e3f-927e-4587-835e-b21370b33262" containerID="c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34" exitCode=1 Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.918379 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerDied","Data":"c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34"} Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.918497 4936 scope.go:117] "RemoveContainer" containerID="0b5058a1a7b45401af8901db8eb454dcac72b5a37555a8da544fbd7d073296c5" Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.919374 4936 scope.go:117] "RemoveContainer" containerID="c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34" Sep 30 13:41:04 crc kubenswrapper[4936]: E0930 13:41:04.922535 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vxjrh_openshift-multus(9dbb1e3f-927e-4587-835e-b21370b33262)\"" pod="openshift-multus/multus-vxjrh" podUID="9dbb1e3f-927e-4587-835e-b21370b33262" Sep 30 13:41:04 crc kubenswrapper[4936]: I0930 13:41:04.939103 4936 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kjdkc" podStartSLOduration=94.939084867 podStartE2EDuration="1m34.939084867s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:40:52.898026524 +0000 UTC m=+103.282028835" watchObservedRunningTime="2025-09-30 13:41:04.939084867 +0000 UTC m=+115.323087188" Sep 30 13:41:05 crc kubenswrapper[4936]: I0930 13:41:05.315520 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:05 crc kubenswrapper[4936]: I0930 13:41:05.315563 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:05 crc kubenswrapper[4936]: I0930 13:41:05.315711 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:05 crc kubenswrapper[4936]: E0930 13:41:05.315704 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:05 crc kubenswrapper[4936]: E0930 13:41:05.315933 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:05 crc kubenswrapper[4936]: E0930 13:41:05.316039 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:05 crc kubenswrapper[4936]: I0930 13:41:05.316461 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:05 crc kubenswrapper[4936]: E0930 13:41:05.316582 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:05 crc kubenswrapper[4936]: I0930 13:41:05.924652 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/1.log" Sep 30 13:41:07 crc kubenswrapper[4936]: I0930 13:41:07.314742 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:07 crc kubenswrapper[4936]: I0930 13:41:07.314842 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:07 crc kubenswrapper[4936]: E0930 13:41:07.314883 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:07 crc kubenswrapper[4936]: I0930 13:41:07.314842 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:07 crc kubenswrapper[4936]: E0930 13:41:07.315064 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:07 crc kubenswrapper[4936]: E0930 13:41:07.315199 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:07 crc kubenswrapper[4936]: I0930 13:41:07.315463 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:07 crc kubenswrapper[4936]: E0930 13:41:07.315532 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:09 crc kubenswrapper[4936]: I0930 13:41:09.314648 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:09 crc kubenswrapper[4936]: E0930 13:41:09.315048 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:09 crc kubenswrapper[4936]: I0930 13:41:09.314681 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:09 crc kubenswrapper[4936]: I0930 13:41:09.314739 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:09 crc kubenswrapper[4936]: E0930 13:41:09.315204 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:09 crc kubenswrapper[4936]: E0930 13:41:09.315263 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:09 crc kubenswrapper[4936]: I0930 13:41:09.314733 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:09 crc kubenswrapper[4936]: E0930 13:41:09.315369 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:10 crc kubenswrapper[4936]: E0930 13:41:10.289623 4936 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 13:41:10 crc kubenswrapper[4936]: E0930 13:41:10.397455 4936 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 13:41:11 crc kubenswrapper[4936]: I0930 13:41:11.314833 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:11 crc kubenswrapper[4936]: I0930 13:41:11.314935 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:11 crc kubenswrapper[4936]: E0930 13:41:11.314972 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:11 crc kubenswrapper[4936]: E0930 13:41:11.315185 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:11 crc kubenswrapper[4936]: I0930 13:41:11.315212 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:11 crc kubenswrapper[4936]: I0930 13:41:11.315232 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:11 crc kubenswrapper[4936]: E0930 13:41:11.315275 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:11 crc kubenswrapper[4936]: E0930 13:41:11.315352 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:13 crc kubenswrapper[4936]: I0930 13:41:13.315271 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:13 crc kubenswrapper[4936]: I0930 13:41:13.315452 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:13 crc kubenswrapper[4936]: E0930 13:41:13.315544 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:13 crc kubenswrapper[4936]: I0930 13:41:13.315578 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:13 crc kubenswrapper[4936]: I0930 13:41:13.315578 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:13 crc kubenswrapper[4936]: E0930 13:41:13.315687 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:13 crc kubenswrapper[4936]: E0930 13:41:13.315772 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:13 crc kubenswrapper[4936]: E0930 13:41:13.315845 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.314518 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.314601 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.314694 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:15 crc kubenswrapper[4936]: E0930 13:41:15.314682 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.314805 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:15 crc kubenswrapper[4936]: E0930 13:41:15.315077 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:15 crc kubenswrapper[4936]: E0930 13:41:15.315558 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:15 crc kubenswrapper[4936]: E0930 13:41:15.315690 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.315929 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:41:15 crc kubenswrapper[4936]: E0930 13:41:15.399949 4936 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.954664 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/3.log" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.957635 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerStarted","Data":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.958177 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:41:15 crc kubenswrapper[4936]: I0930 13:41:15.986222 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podStartSLOduration=105.98620534 podStartE2EDuration="1m45.98620534s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:15.98525502 +0000 UTC m=+126.369257331" watchObservedRunningTime="2025-09-30 13:41:15.98620534 +0000 UTC m=+126.370207641" Sep 30 13:41:16 crc kubenswrapper[4936]: I0930 13:41:16.315303 4936 scope.go:117] "RemoveContainer" containerID="c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34" Sep 30 13:41:16 crc kubenswrapper[4936]: I0930 13:41:16.408157 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v46m"] Sep 30 13:41:16 crc kubenswrapper[4936]: I0930 13:41:16.408251 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:16 crc kubenswrapper[4936]: E0930 13:41:16.408324 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:16 crc kubenswrapper[4936]: I0930 13:41:16.962655 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/1.log" Sep 30 13:41:16 crc kubenswrapper[4936]: I0930 13:41:16.962926 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerStarted","Data":"326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e"} Sep 30 13:41:17 crc kubenswrapper[4936]: I0930 13:41:17.314574 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:17 crc kubenswrapper[4936]: I0930 13:41:17.314724 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:17 crc kubenswrapper[4936]: E0930 13:41:17.314909 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:17 crc kubenswrapper[4936]: I0930 13:41:17.314957 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:17 crc kubenswrapper[4936]: E0930 13:41:17.315106 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:17 crc kubenswrapper[4936]: E0930 13:41:17.315303 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:18 crc kubenswrapper[4936]: I0930 13:41:18.314509 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:18 crc kubenswrapper[4936]: E0930 13:41:18.314997 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:19 crc kubenswrapper[4936]: I0930 13:41:19.314708 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:19 crc kubenswrapper[4936]: I0930 13:41:19.314786 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:19 crc kubenswrapper[4936]: E0930 13:41:19.314826 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:41:19 crc kubenswrapper[4936]: E0930 13:41:19.314911 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:41:19 crc kubenswrapper[4936]: I0930 13:41:19.314786 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:19 crc kubenswrapper[4936]: E0930 13:41:19.314976 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:41:20 crc kubenswrapper[4936]: I0930 13:41:20.314851 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:20 crc kubenswrapper[4936]: E0930 13:41:20.315739 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v46m" podUID="e3bd8048-3efa-41ed-a7ff-8d477db72be7" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.314963 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.314971 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.314998 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.319018 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.319200 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.319411 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 13:41:21 crc kubenswrapper[4936]: I0930 13:41:21.319524 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 13:41:22 crc kubenswrapper[4936]: I0930 13:41:22.314858 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:22 crc kubenswrapper[4936]: I0930 13:41:22.317072 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 13:41:22 crc kubenswrapper[4936]: I0930 13:41:22.318376 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.004244 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.079642 4936 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.121104 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2nljs"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 
13:41:23.121748 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.122163 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.122705 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.126586 4936 reflector.go:561] object-"openshift-authentication-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.126849 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.126654 4936 reflector.go:561] object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w": failed to list *v1.Secret: secrets "cluster-samples-operator-dockercfg-xpp9w" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.126993 4936 reflector.go:158] 
"Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xpp9w\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-samples-operator-dockercfg-xpp9w\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.127483 4936 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.127575 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.127594 4936 reflector.go:561] object-"openshift-authentication-operator"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.127705 4936 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.127611 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.127629 4936 reflector.go:561] object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.127879 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.128919 4936 reflector.go:561] object-"openshift-cluster-samples-operator"/"samples-operator-tls": failed to list *v1.Secret: secrets "samples-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 
13:41:23.128974 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"samples-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.129044 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.129153 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.129247 4936 reflector.go:561] object-"openshift-authentication-operator"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.129276 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.130062 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.130582 4936 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.130816 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hj57l"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.130582 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.131298 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: W0930 13:41:23.145712 4936 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.145751 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 13:41:23 crc kubenswrapper[4936]: E0930 13:41:23.145760 4936 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.145875 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.145899 4936 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.146042 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.146485 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.146613 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.146641 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.146954 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.147348 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.147430 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.147534 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.152570 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.153121 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.155749 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.156227 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.159657 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.159677 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.161791 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.178938 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.179501 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180044 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180205 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180277 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180446 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180671 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.180827 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181079 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181193 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181230 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181635 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181715 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181842 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.181985 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.182039 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.182151 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.182547 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.182552 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.183818 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nfcbn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.183912 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.184186 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.185162 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.186896 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.187116 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.188422 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rw728"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.188993 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.189354 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.191413 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.193590 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d5kqn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.194026 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.194945 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.207640 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.208051 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.208452 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.208783 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.213525 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.213670 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.213803 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.214988 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6rwjr"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.215534 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.215658 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.215843 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fg254"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.216306 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.216453 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.216699 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.216810 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217129 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217206 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217375 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217598 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217643 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217658 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-sht9l"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217645 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217773 4936 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"oauth-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217794 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217824 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217599 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217867 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217920 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.217972 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.218254 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221370 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221534 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221632 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221769 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221840 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.221897 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222036 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222057 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222149 4936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222204 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222281 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.222424 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.223952 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.224239 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.224491 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.224771 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.226091 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.230603 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.231901 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 
13:41:23.232175 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.233243 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.233879 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.234119 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.235108 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.235290 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.235945 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.236034 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.236586 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.237558 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.241322 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 
13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.241944 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247512 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63461350-882b-40f6-8651-d6273f3e5a2d-metrics-tls\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247555 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247588 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvkl\" (UniqueName: \"kubernetes.io/projected/50b3ddf6-dac3-4108-a740-978dcd73ef6a-kube-api-access-vgvkl\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247694 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tcq\" (UniqueName: \"kubernetes.io/projected/63461350-882b-40f6-8651-d6273f3e5a2d-kube-api-access-g7tcq\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: 
I0930 13:41:23.247725 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhckg\" (UniqueName: \"kubernetes.io/projected/e16bbb17-ee43-4b34-885f-0e042fcde913-kube-api-access-vhckg\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247756 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb040b17-af49-45f7-9405-2f75256dbbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247797 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247824 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7gd\" (UniqueName: \"kubernetes.io/projected/0c8003ab-2870-41d5-a1c7-30dd4232d184-kube-api-access-bz7gd\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.247850 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.248362 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.248540 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.253796 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w5j7\" (UniqueName: \"kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.254028 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.258454 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.258922 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259056 4936 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259117 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pchrm\" (UniqueName: \"kubernetes.io/projected/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-kube-api-access-pchrm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259195 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259265 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-serving-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259299 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259362 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259505 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-audit-dir\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259535 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259758 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259911 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.259944 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260062 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260089 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260115 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260241 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8003ab-2870-41d5-a1c7-30dd4232d184-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260266 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260424 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tsk\" (UniqueName: \"kubernetes.io/projected/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-kube-api-access-m7tsk\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260471 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260504 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-config\") pod 
\"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260517 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260537 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-node-pullsecrets\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260590 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-images\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260628 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260599 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260733 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.260832 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-encryption-config\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261040 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261068 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261102 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lrk\" (UniqueName: \"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-kube-api-access-59lrk\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: 
\"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261126 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261149 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261172 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261197 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6v25\" (UniqueName: \"kubernetes.io/projected/eb040b17-af49-45f7-9405-2f75256dbbe4-kube-api-access-g6v25\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261223 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhl4g\" (UniqueName: \"kubernetes.io/projected/a641ed11-580d-41ac-967c-e145d80b03fa-kube-api-access-xhl4g\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261246 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261267 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-audit\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261290 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261362 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-image-import-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261408 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261432 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-serving-cert\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261457 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261480 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261506 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261531 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf5d5\" (UniqueName: \"kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261558 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261580 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: 
\"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261602 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-client\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261624 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb040b17-af49-45f7-9405-2f75256dbbe4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261652 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c8003ab-2870-41d5-a1c7-30dd4232d184-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261677 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261704 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261783 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrcp\" (UniqueName: \"kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261814 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-config\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261834 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261856 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: 
\"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261877 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8f7\" (UniqueName: \"kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.261897 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmj4\" (UniqueName: \"kubernetes.io/projected/d50867fd-81e4-416d-a112-a84b175be026-kube-api-access-lqmj4\") pod \"downloads-7954f5f757-nfcbn\" (UID: \"d50867fd-81e4-416d-a112-a84b175be026\") " pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262061 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-auth-proxy-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262089 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262130 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87x2g\" (UniqueName: \"kubernetes.io/projected/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-kube-api-access-87x2g\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262159 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262182 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50b3ddf6-dac3-4108-a740-978dcd73ef6a-machine-approver-tls\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262201 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262242 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-config\") pod 
\"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262432 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262579 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.262719 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.263395 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.263642 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.263826 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.264053 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.271712 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.274216 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.274447 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.274913 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.275291 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2nljs"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.277120 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.277473 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.277612 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hj57l"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.278544 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.279090 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.279269 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.280519 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.289543 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nldp8"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.290075 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.290358 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.290567 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.290756 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.291002 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.291212 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.291291 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.291742 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.291948 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.292251 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.294174 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.294925 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.298007 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.298414 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.298550 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.301454 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.314993 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.316046 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.316080 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.316802 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-n8p2c"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.317464 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.318280 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.326776 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.326849 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.326878 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.327227 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.327233 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-plg5s"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.329360 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.329463 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.333796 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.337248 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.337603 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.352461 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.353828 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.355198 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362837 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-serving-cert\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362869 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 
30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362885 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362903 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-metrics-certs\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362919 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ccd121-a5fc-4f11-b256-70b80420eb21-serving-cert\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362934 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-config\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362956 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2667a269-9771-4873-8ed1-6781e6aab9bf-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362974 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.362993 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363010 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363032 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf5d5\" (UniqueName: \"kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 
13:41:23.363047 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-client\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363061 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb040b17-af49-45f7-9405-2f75256dbbe4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363077 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmkf\" (UniqueName: \"kubernetes.io/projected/9881cb2e-950d-4550-b04a-8254d9581cd1-kube-api-access-fmmkf\") pod \"migrator-59844c95c7-wd2th\" (UID: \"9881cb2e-950d-4550-b04a-8254d9581cd1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363093 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c8003ab-2870-41d5-a1c7-30dd4232d184-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363110 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: 
\"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363128 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-default-certificate\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363143 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363161 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363177 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnrcp\" (UniqueName: \"kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363193 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-config\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363216 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363232 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363248 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8f7\" (UniqueName: \"kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363264 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmj4\" (UniqueName: \"kubernetes.io/projected/d50867fd-81e4-416d-a112-a84b175be026-kube-api-access-lqmj4\") pod \"downloads-7954f5f757-nfcbn\" (UID: \"d50867fd-81e4-416d-a112-a84b175be026\") " pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363281 
4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-auth-proxy-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363299 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363316 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87x2g\" (UniqueName: \"kubernetes.io/projected/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-kube-api-access-87x2g\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363347 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-client\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363363 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4f46\" (UniqueName: \"kubernetes.io/projected/1664a2e3-61da-4eea-bdba-06b422cbb9b6-kube-api-access-h4f46\") pod 
\"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363380 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqzb\" (UniqueName: \"kubernetes.io/projected/2667a269-9771-4873-8ed1-6781e6aab9bf-kube-api-access-nlqzb\") pod \"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363398 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363413 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50b3ddf6-dac3-4108-a740-978dcd73ef6a-machine-approver-tls\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363739 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-config\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363759 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363782 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63461350-882b-40f6-8651-d6273f3e5a2d-metrics-tls\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363796 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363812 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvkl\" (UniqueName: \"kubernetes.io/projected/50b3ddf6-dac3-4108-a740-978dcd73ef6a-kube-api-access-vgvkl\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363827 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tcq\" (UniqueName: \"kubernetes.io/projected/63461350-882b-40f6-8651-d6273f3e5a2d-kube-api-access-g7tcq\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 
13:41:23.363844 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhckg\" (UniqueName: \"kubernetes.io/projected/e16bbb17-ee43-4b34-885f-0e042fcde913-kube-api-access-vhckg\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363859 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb040b17-af49-45f7-9405-2f75256dbbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363875 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363892 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7gd\" (UniqueName: \"kubernetes.io/projected/0c8003ab-2870-41d5-a1c7-30dd4232d184-kube-api-access-bz7gd\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363908 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363924 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w5j7\" (UniqueName: \"kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363941 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-encryption-config\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363966 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363981 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9470f-af56-43d0-9d14-90930c95615a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.363995 
4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-service-ca-bundle\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364012 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pchrm\" (UniqueName: \"kubernetes.io/projected/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-kube-api-access-pchrm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364030 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364045 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-serving-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364060 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " 
pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364075 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364093 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364123 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-audit-dir\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364137 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364153 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-stats-auth\") pod \"router-default-5444994796-sht9l\" (UID: 
\"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364169 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-trusted-ca\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364204 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364221 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364240 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364262 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-dir\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364290 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364312 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8003ab-2870-41d5-a1c7-30dd4232d184-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364350 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364368 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tsk\" (UniqueName: \"kubernetes.io/projected/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-kube-api-access-m7tsk\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364385 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364400 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364415 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzfn\" (UniqueName: \"kubernetes.io/projected/c2ccd121-a5fc-4f11-b256-70b80420eb21-kube-api-access-7lzfn\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364433 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-config\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364448 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-node-pullsecrets\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364466 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364486 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-images\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364495 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.369795 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50b3ddf6-dac3-4108-a740-978dcd73ef6a-auth-proxy-config\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.370679 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.371009 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-client\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.372030 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-serving-cert\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.372056 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc 
kubenswrapper[4936]: I0930 13:41:23.372100 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.372861 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.373172 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.373461 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.373628 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d5kqn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.374202 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.374235 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.374266 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-audit-dir\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.375305 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.375585 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-encryption-config\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.375943 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50b3ddf6-dac3-4108-a740-978dcd73ef6a-machine-approver-tls\") pod \"machine-approver-56656f9798-rbjr5\" (UID: 
\"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.376412 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-config\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.377091 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.377552 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.377584 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0c8003ab-2870-41d5-a1c7-30dd4232d184-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.379777 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.379950 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb040b17-af49-45f7-9405-2f75256dbbe4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.364509 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a641ed11-580d-41ac-967c-e145d80b03fa-encryption-config\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.380034 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.380077 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d9470f-af56-43d0-9d14-90930c95615a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.380610 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-config\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.381031 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6rwjr"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.381702 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.382308 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.384511 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.385038 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.385078 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nfcbn"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.387217 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.387449 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.387491 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.388244 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb040b17-af49-45f7-9405-2f75256dbbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.388641 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.388693 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-etcd-serving-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.388915 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-serving-cert\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389029 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389064 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59lrk\" (UniqueName: \"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-kube-api-access-59lrk\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389098 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389126 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389151 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389175 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6v25\" (UniqueName: \"kubernetes.io/projected/eb040b17-af49-45f7-9405-2f75256dbbe4-kube-api-access-g6v25\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389189 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389201 
4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhl4g\" (UniqueName: \"kubernetes.io/projected/a641ed11-580d-41ac-967c-e145d80b03fa-kube-api-access-xhl4g\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389232 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzt5\" (UniqueName: \"kubernetes.io/projected/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-kube-api-access-kqzt5\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389260 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389287 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-audit\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389314 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 
13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389357 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389384 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-policies\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389413 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-image-import-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389464 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389490 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8nd\" (UniqueName: \"kubernetes.io/projected/33d9470f-af56-43d0-9d14-90930c95615a-kube-api-access-kv8nd\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.389584 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.390049 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.390193 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a641ed11-580d-41ac-967c-e145d80b03fa-node-pullsecrets\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.390299 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.391236 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63461350-882b-40f6-8651-d6273f3e5a2d-metrics-tls\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.392076 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-images\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.392211 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.392751 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-audit\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.392753 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.394267 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.394529 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-config\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.395057 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.395621 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fg254"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.395677 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a641ed11-580d-41ac-967c-e145d80b03fa-image-import-ca\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.396170 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.398565 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.398639 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.400000 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.400176 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.400425 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.401306 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.405044 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.405454 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c8003ab-2870-41d5-a1c7-30dd4232d184-serving-cert\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 
13:41:23.405677 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.408034 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.408372 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.410054 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.410113 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.410144 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-xz7g4"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.410147 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.410894 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.411192 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mqxlc"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.412251 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.412340 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.415237 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.415383 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.415458 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.419063 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.419109 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-plg5s"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.419119 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.423754 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.433625 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rw728"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.433693 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.439103 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.441008 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.441055 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.448429 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.448703 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nldp8"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.454742 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bjbnj"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.455791 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.455952 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.458608 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.463397 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.466381 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bjbnj"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.466970 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.468780 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mqxlc"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.470087 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-n8p2c"] Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.484720 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.494980 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495021 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-stats-auth\") pod 
\"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495041 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-trusted-ca\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495078 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-dir\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495100 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzfn\" (UniqueName: \"kubernetes.io/projected/c2ccd121-a5fc-4f11-b256-70b80420eb21-kube-api-access-7lzfn\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495126 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d9470f-af56-43d0-9d14-90930c95615a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495190 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-serving-cert\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495224 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzt5\" (UniqueName: \"kubernetes.io/projected/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-kube-api-access-kqzt5\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495244 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-policies\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495281 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8nd\" (UniqueName: \"kubernetes.io/projected/33d9470f-af56-43d0-9d14-90930c95615a-kube-api-access-kv8nd\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495305 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-metrics-certs\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc 
kubenswrapper[4936]: I0930 13:41:23.495345 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2ccd121-a5fc-4f11-b256-70b80420eb21-serving-cert\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495359 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2667a269-9771-4873-8ed1-6781e6aab9bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495374 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-config\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495395 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmkf\" (UniqueName: \"kubernetes.io/projected/9881cb2e-950d-4550-b04a-8254d9581cd1-kube-api-access-fmmkf\") pod \"migrator-59844c95c7-wd2th\" (UID: \"9881cb2e-950d-4550-b04a-8254d9581cd1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495410 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-default-certificate\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " 
pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495453 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495522 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-client\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495546 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4f46\" (UniqueName: \"kubernetes.io/projected/1664a2e3-61da-4eea-bdba-06b422cbb9b6-kube-api-access-h4f46\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495561 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqzb\" (UniqueName: \"kubernetes.io/projected/2667a269-9771-4873-8ed1-6781e6aab9bf-kube-api-access-nlqzb\") pod \"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495619 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-encryption-config\") pod 
\"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495675 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9470f-af56-43d0-9d14-90930c95615a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.495733 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-service-ca-bundle\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.496721 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.497210 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-trusted-ca\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.497409 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.497498 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-policies\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.498093 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ccd121-a5fc-4f11-b256-70b80420eb21-config\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.498189 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1664a2e3-61da-4eea-bdba-06b422cbb9b6-audit-dir\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.504763 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-etcd-client\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.505155 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c2ccd121-a5fc-4f11-b256-70b80420eb21-serving-cert\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.505267 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-encryption-config\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.508699 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1664a2e3-61da-4eea-bdba-06b422cbb9b6-serving-cert\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.512976 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.527290 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.545726 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.563543 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.583419 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.604349 4936 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.611860 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-default-certificate\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.624955 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.628948 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-stats-auth\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.644635 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.649189 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-metrics-certs\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.663958 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.683969 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 13:41:23 crc 
kubenswrapper[4936]: I0930 13:41:23.686825 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-service-ca-bundle\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.704044 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.744399 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.792527 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.792645 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.809766 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.824973 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.844103 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.863795 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.884847 4936 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.904252 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.923806 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.944118 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.964969 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 13:41:23 crc kubenswrapper[4936]: I0930 13:41:23.984446 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.004287 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.024455 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.043869 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.064194 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.083859 4936 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.104015 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.124749 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.145365 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.152155 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2667a269-9771-4873-8ed1-6781e6aab9bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.164372 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.184813 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.204652 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.224395 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 
13:41:24.243959 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.251153 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33d9470f-af56-43d0-9d14-90930c95615a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.263932 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.284161 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.289650 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9470f-af56-43d0-9d14-90930c95615a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.302040 4936 request.go:700] Waited for 1.010803133s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.303325 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 
13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.325388 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.344188 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.363935 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.380082 4936 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.380202 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle podName:7cd84bc0-8b2a-45a0-bda3-e1cad2d795df nodeName:}" failed. No retries permitted until 2025-09-30 13:41:24.880179076 +0000 UTC m=+135.264181377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle") pod "authentication-operator-69f744f599-2nljs" (UID: "7cd84bc0-8b2a-45a0-bda3-e1cad2d795df") : failed to sync configmap cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.384013 4936 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.384085 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle podName:7cd84bc0-8b2a-45a0-bda3-e1cad2d795df nodeName:}" failed. No retries permitted until 2025-09-30 13:41:24.88406784 +0000 UTC m=+135.268070141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle") pod "authentication-operator-69f744f599-2nljs" (UID: "7cd84bc0-8b2a-45a0-bda3-e1cad2d795df") : failed to sync configmap cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.384559 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.393291 4936 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.393416 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert podName:7cd84bc0-8b2a-45a0-bda3-e1cad2d795df nodeName:}" failed. 
No retries permitted until 2025-09-30 13:41:24.893390714 +0000 UTC m=+135.277393025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert") pod "authentication-operator-69f744f599-2nljs" (UID: "7cd84bc0-8b2a-45a0-bda3-e1cad2d795df") : failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.394417 4936 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.394510 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session podName:06045f3a-af69-49c7-9759-915cd9fb4c65 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:24.894488666 +0000 UTC m=+135.278491027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session") pod "oauth-openshift-558db77b4-7ccwq" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65") : failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.396461 4936 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: E0930 13:41:24.396510 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls podName:e16bbb17-ee43-4b34-885f-0e042fcde913 nodeName:}" failed. No retries permitted until 2025-09-30 13:41:24.896500545 +0000 UTC m=+135.280502846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-njdd2" (UID: "e16bbb17-ee43-4b34-885f-0e042fcde913") : failed to sync secret cache: timed out waiting for the condition Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.402918 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.424075 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.444035 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.464378 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.485143 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.515852 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.524468 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.543615 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.563803 4936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.584068 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.603718 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.644401 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.663319 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.684984 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.704855 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.724308 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.743732 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.764048 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.785126 4936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.805070 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.824720 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.844576 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.864663 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.883944 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.903679 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.924157 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.925407 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.928435 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.928595 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.928729 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.929723 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.944922 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 13:41:24.963754 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 13:41:24 crc kubenswrapper[4936]: I0930 
13:41:24.985056 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.026214 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8f7\" (UniqueName: \"kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.063111 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmj4\" (UniqueName: \"kubernetes.io/projected/d50867fd-81e4-416d-a112-a84b175be026-kube-api-access-lqmj4\") pod \"downloads-7954f5f757-nfcbn\" (UID: \"d50867fd-81e4-416d-a112-a84b175be026\") " pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.082903 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf5d5\" (UniqueName: \"kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5\") pod \"console-f9d7485db-jl85m\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.083319 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.111943 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrcp\" (UniqueName: \"kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp\") pod \"controller-manager-879f6c89f-djmjn\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.133296 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w5j7\" (UniqueName: \"kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7\") pod \"route-controller-manager-6576b87f9c-gd7hl\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.152074 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvkl\" (UniqueName: \"kubernetes.io/projected/50b3ddf6-dac3-4108-a740-978dcd73ef6a-kube-api-access-vgvkl\") pod \"machine-approver-56656f9798-rbjr5\" (UID: \"50b3ddf6-dac3-4108-a740-978dcd73ef6a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.166187 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tcq\" (UniqueName: \"kubernetes.io/projected/63461350-882b-40f6-8651-d6273f3e5a2d-kube-api-access-g7tcq\") pod \"dns-operator-744455d44c-d5kqn\" (UID: \"63461350-882b-40f6-8651-d6273f3e5a2d\") " pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.173639 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.184668 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhckg\" (UniqueName: \"kubernetes.io/projected/e16bbb17-ee43-4b34-885f-0e042fcde913-kube-api-access-vhckg\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.213256 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7gd\" (UniqueName: \"kubernetes.io/projected/0c8003ab-2870-41d5-a1c7-30dd4232d184-kube-api-access-bz7gd\") pod \"openshift-config-operator-7777fb866f-6nvzg\" (UID: \"0c8003ab-2870-41d5-a1c7-30dd4232d184\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.230618 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.231656 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pchrm\" (UniqueName: \"kubernetes.io/projected/10b0a93a-9b8f-48d1-bdc2-defe765f1fab-kube-api-access-pchrm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cbz9v\" (UID: \"10b0a93a-9b8f-48d1-bdc2-defe765f1fab\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.249490 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.265271 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lrk\" (UniqueName: \"kubernetes.io/projected/2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474-kube-api-access-59lrk\") pod \"cluster-image-registry-operator-dc59b4c8b-gc5z6\" (UID: \"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.265901 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.276260 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.279323 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tsk\" (UniqueName: \"kubernetes.io/projected/27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30-kube-api-access-m7tsk\") pod \"machine-api-operator-5694c8668f-rw728\" (UID: \"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.301032 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6v25\" (UniqueName: \"kubernetes.io/projected/eb040b17-af49-45f7-9405-2f75256dbbe4-kube-api-access-g6v25\") pod \"openshift-apiserver-operator-796bbdcf4f-xm8pb\" (UID: \"eb040b17-af49-45f7-9405-2f75256dbbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.302633 4936 request.go:700] Waited for 1.909718111s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.321839 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.325514 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.325689 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.330130 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhl4g\" (UniqueName: \"kubernetes.io/projected/a641ed11-580d-41ac-967c-e145d80b03fa-kube-api-access-xhl4g\") pod \"apiserver-76f77b778f-hj57l\" (UID: \"a641ed11-580d-41ac-967c-e145d80b03fa\") " pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.344152 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.347940 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:25 crc kubenswrapper[4936]: W0930 13:41:25.353554 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e5e231_b700_4151_81c8_111a3af3bfc2.slice/crio-a4068dfc77dfcc96ad8374b6c6fefb3e7dc3264c1810da086406c939573573dc WatchSource:0}: Error finding container a4068dfc77dfcc96ad8374b6c6fefb3e7dc3264c1810da086406c939573573dc: Status 404 returned error can't find the container with id a4068dfc77dfcc96ad8374b6c6fefb3e7dc3264c1810da086406c939573573dc Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.366922 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.396013 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.399729 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.407198 4936 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.425895 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.447023 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.447367 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.476441 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.485011 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.520196 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8nd\" (UniqueName: \"kubernetes.io/projected/33d9470f-af56-43d0-9d14-90930c95615a-kube-api-access-kv8nd\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mpz7\" (UID: \"33d9470f-af56-43d0-9d14-90930c95615a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.543192 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzt5\" (UniqueName: \"kubernetes.io/projected/2ea3912a-d1e4-4c08-81d1-3788ce6096e6-kube-api-access-kqzt5\") pod \"router-default-5444994796-sht9l\" (UID: \"2ea3912a-d1e4-4c08-81d1-3788ce6096e6\") " pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.546496 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.559454 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.560291 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlqzb\" (UniqueName: \"kubernetes.io/projected/2667a269-9771-4873-8ed1-6781e6aab9bf-kube-api-access-nlqzb\") pod \"multus-admission-controller-857f4d67dd-nldp8\" (UID: \"2667a269-9771-4873-8ed1-6781e6aab9bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.582948 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmmkf\" (UniqueName: \"kubernetes.io/projected/9881cb2e-950d-4550-b04a-8254d9581cd1-kube-api-access-fmmkf\") pod \"migrator-59844c95c7-wd2th\" (UID: \"9881cb2e-950d-4550-b04a-8254d9581cd1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.607887 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4f46\" (UniqueName: \"kubernetes.io/projected/1664a2e3-61da-4eea-bdba-06b422cbb9b6-kube-api-access-h4f46\") pod \"apiserver-7bbb656c7d-46l8f\" (UID: \"1664a2e3-61da-4eea-bdba-06b422cbb9b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.608093 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.621170 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.623938 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzfn\" (UniqueName: \"kubernetes.io/projected/c2ccd121-a5fc-4f11-b256-70b80420eb21-kube-api-access-7lzfn\") pod \"console-operator-58897d9998-6rwjr\" (UID: \"c2ccd121-a5fc-4f11-b256-70b80420eb21\") " pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.638447 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.651267 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.655779 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.658764 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nfcbn"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.664392 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.687622 4936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.687664 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.694356 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.699904 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87x2g\" (UniqueName: \"kubernetes.io/projected/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-kube-api-access-87x2g\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.699951 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.704024 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.712141 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.720665 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7ccwq\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.723653 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.730670 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-serving-cert\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:25 crc kubenswrapper[4936]: W0930 13:41:25.730942 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea3912a_d1e4_4c08_81d1_3788ce6096e6.slice/crio-eec6d7d68b207f701685730e63e34f86891af561fd6a91b0fc9d352b1af4c4fc WatchSource:0}: Error finding container eec6d7d68b207f701685730e63e34f86891af561fd6a91b0fc9d352b1af4c4fc: Status 404 returned error can't find the container with id 
eec6d7d68b207f701685730e63e34f86891af561fd6a91b0fc9d352b1af4c4fc Sep 30 13:41:25 crc kubenswrapper[4936]: W0930 13:41:25.736628 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8003ab_2870_41d5_a1c7_30dd4232d184.slice/crio-e1632bde4e125a1e68737ada0a4eee5143b8a178860442260a56e7d16f91c492 WatchSource:0}: Error finding container e1632bde4e125a1e68737ada0a4eee5143b8a178860442260a56e7d16f91c492: Status 404 returned error can't find the container with id e1632bde4e125a1e68737ada0a4eee5143b8a178860442260a56e7d16f91c492 Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.738065 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.747756 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.768599 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.776304 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cd84bc0-8b2a-45a0-bda3-e1cad2d795df-service-ca-bundle\") pod \"authentication-operator-69f744f599-2nljs\" (UID: \"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.788711 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.793359 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-d5kqn"] Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.805150 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.818530 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e16bbb17-ee43-4b34-885f-0e042fcde913-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-njdd2\" (UID: \"e16bbb17-ee43-4b34-885f-0e042fcde913\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.842470 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.857967 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858460 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zjk\" (UniqueName: \"kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858654 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-images\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858681 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28f20487-8292-428a-b17e-2bb85ad6a0d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858725 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54105c99-d0f2-4653-b0b5-e3af3b8a118a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 
13:41:25.858760 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858774 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-service-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858789 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8pkq\" (UniqueName: \"kubernetes.io/projected/b6442e1c-e629-43df-b0ad-adf82f384a6a-kube-api-access-h8pkq\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858813 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc7jg\" (UniqueName: \"kubernetes.io/projected/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-kube-api-access-tc7jg\") pod \"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858869 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssl5j\" (UniqueName: 
\"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-kube-api-access-ssl5j\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858884 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858897 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858922 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858936 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858952 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858966 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-serving-cert\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858982 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwgwj\" (UniqueName: \"kubernetes.io/projected/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-kube-api-access-fwgwj\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.858997 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfqvn\" (UniqueName: \"kubernetes.io/projected/28f20487-8292-428a-b17e-2bb85ad6a0d6-kube-api-access-tfqvn\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859012 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwdd\" (UniqueName: \"kubernetes.io/projected/a231f851-06a5-4424-8631-04c71f2ccf7a-kube-api-access-hcwdd\") pod \"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859026 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54105c99-d0f2-4653-b0b5-e3af3b8a118a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859042 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54105c99-d0f2-4653-b0b5-e3af3b8a118a-config\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859057 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859070 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3d9b017e-f89a-410c-a6f1-7db0bc934b79-trusted-ca\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859097 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvxwl\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859110 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859125 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-srv-cert\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859169 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29789917-34b7-4bc9-8b49-a00b5055f092-proxy-tls\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859196 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-config\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859210 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtxh\" (UniqueName: \"kubernetes.io/projected/29789917-34b7-4bc9-8b49-a00b5055f092-kube-api-access-wqtxh\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859233 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff5368c-b6b0-4e04-83db-314de1999000-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859272 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859287 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ff5368c-b6b0-4e04-83db-314de1999000-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859349 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859374 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d9b017e-f89a-410c-a6f1-7db0bc934b79-metrics-tls\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859410 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff5368c-b6b0-4e04-83db-314de1999000-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859451 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a231f851-06a5-4424-8631-04c71f2ccf7a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859472 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859495 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.859516 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-srv-cert\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860826 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-config\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860850 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860892 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860915 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-client\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860930 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qmnm\" (UniqueName: 
\"kubernetes.io/projected/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-kube-api-access-8qmnm\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860955 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28f20487-8292-428a-b17e-2bb85ad6a0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.860970 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.861008 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: E0930 13:41:25.862757 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.362744814 +0000 UTC m=+136.746747115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.888863 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.890773 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.920183 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.920305 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hj57l"] Sep 30 13:41:25 crc kubenswrapper[4936]: W0930 13:41:25.954164 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda641ed11_580d_41ac_967c_e145d80b03fa.slice/crio-a2830146cdb503e33d90a62d423e0aa6426958233820170695da7701f4e2129b WatchSource:0}: Error finding container a2830146cdb503e33d90a62d423e0aa6426958233820170695da7701f4e2129b: Status 404 returned error can't find the container with id a2830146cdb503e33d90a62d423e0aa6426958233820170695da7701f4e2129b Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.957729 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6"] Sep 30 13:41:25 crc 
kubenswrapper[4936]: I0930 13:41:25.962320 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962491 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-config\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962535 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962570 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-csi-data-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962589 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qmnm\" (UniqueName: \"kubernetes.io/projected/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-kube-api-access-8qmnm\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962605 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5vl\" (UniqueName: \"kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-client\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962642 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28f20487-8292-428a-b17e-2bb85ad6a0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962658 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962688 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962706 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rzwn\" (UniqueName: \"kubernetes.io/projected/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-kube-api-access-9rzwn\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962755 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zjk\" (UniqueName: \"kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962773 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-images\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962789 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28f20487-8292-428a-b17e-2bb85ad6a0d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962804 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-cert\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962839 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54105c99-d0f2-4653-b0b5-e3af3b8a118a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962868 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9d8cedf4-9075-4715-9248-53ad5d391a28-tmpfs\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962884 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962900 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aff95854-e88b-463e-af90-ed33ab4c24bf-config-volume\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 
13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962915 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-mountpoint-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962959 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-service-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.962975 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8pkq\" (UniqueName: \"kubernetes.io/projected/b6442e1c-e629-43df-b0ad-adf82f384a6a-kube-api-access-h8pkq\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963007 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzq94\" (UniqueName: \"kubernetes.io/projected/28e6aa1b-b20a-4017-9659-a36c70e1484b-kube-api-access-gzq94\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963038 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc7jg\" (UniqueName: \"kubernetes.io/projected/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-kube-api-access-tc7jg\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963068 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-node-bootstrap-token\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963084 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssl5j\" (UniqueName: \"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-kube-api-access-ssl5j\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963101 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-certs\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963127 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5jh\" (UniqueName: \"kubernetes.io/projected/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-kube-api-access-bh5jh\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963152 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.963175 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-socket-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: E0930 13:41:25.964107 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.46408594 +0000 UTC m=+136.848088241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964205 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964231 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-plugins-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e6aa1b-b20a-4017-9659-a36c70e1484b-serving-cert\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964279 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964296 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964313 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-key\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964343 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-serving-cert\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964370 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964408 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwgwj\" (UniqueName: \"kubernetes.io/projected/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-kube-api-access-fwgwj\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964444 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfqvn\" (UniqueName: \"kubernetes.io/projected/28f20487-8292-428a-b17e-2bb85ad6a0d6-kube-api-access-tfqvn\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964459 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54105c99-d0f2-4653-b0b5-e3af3b8a118a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964473 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964499 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwdd\" (UniqueName: \"kubernetes.io/projected/a231f851-06a5-4424-8631-04c71f2ccf7a-kube-api-access-hcwdd\") pod 
\"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964515 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54105c99-d0f2-4653-b0b5-e3af3b8a118a-config\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964533 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e6aa1b-b20a-4017-9659-a36c70e1484b-config\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964560 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964576 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d9b017e-f89a-410c-a6f1-7db0bc934b79-trusted-ca\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964590 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-srv-cert\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964605 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964621 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvxwl\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964638 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964653 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29789917-34b7-4bc9-8b49-a00b5055f092-proxy-tls\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964671 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-config\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964695 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-config\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964714 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtxh\" (UniqueName: \"kubernetes.io/projected/29789917-34b7-4bc9-8b49-a00b5055f092-kube-api-access-wqtxh\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964731 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-apiservice-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964760 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/aff95854-e88b-463e-af90-ed33ab4c24bf-metrics-tls\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964776 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpsm5\" (UniqueName: \"kubernetes.io/projected/9d8cedf4-9075-4715-9248-53ad5d391a28-kube-api-access-kpsm5\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964812 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964829 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff5368c-b6b0-4e04-83db-314de1999000-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.964844 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 
13:41:25.964859 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ff5368c-b6b0-4e04-83db-314de1999000-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.967538 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54105c99-d0f2-4653-b0b5-e3af3b8a118a-config\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.969188 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.970727 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.971408 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973382 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tsp\" (UniqueName: \"kubernetes.io/projected/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-kube-api-access-n2tsp\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973426 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-client\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973471 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-cabundle\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973532 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d9b017e-f89a-410c-a6f1-7db0bc934b79-metrics-tls\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973585 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973644 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff5368c-b6b0-4e04-83db-314de1999000-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973729 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-registration-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973759 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nflz\" (UniqueName: \"kubernetes.io/projected/aff95854-e88b-463e-af90-ed33ab4c24bf-kube-api-access-5nflz\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973921 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973957 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a231f851-06a5-4424-8631-04c71f2ccf7a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.973980 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.974006 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-webhook-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.974028 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-srv-cert\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.974064 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dcp\" (UniqueName: \"kubernetes.io/projected/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-kube-api-access-d2dcp\") pod \"service-ca-9c57cc56f-n8p2c\" 
(UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.974274 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.975614 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-images\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.976795 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-service-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.977678 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-etcd-ca\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.977824 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28f20487-8292-428a-b17e-2bb85ad6a0d6-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.981299 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.982492 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-config\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.983326 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d9b017e-f89a-410c-a6f1-7db0bc934b79-trusted-ca\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.988079 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28f20487-8292-428a-b17e-2bb85ad6a0d6-proxy-tls\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.988302 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/29789917-34b7-4bc9-8b49-a00b5055f092-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.990028 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a231f851-06a5-4424-8631-04c71f2ccf7a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.993743 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54105c99-d0f2-4653-b0b5-e3af3b8a118a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.994916 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-srv-cert\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.996076 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff5368c-b6b0-4e04-83db-314de1999000-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.996423 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.996639 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-srv-cert\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.996823 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.997244 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-serving-cert\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:25 crc kubenswrapper[4936]: I0930 13:41:25.998414 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6442e1c-e629-43df-b0ad-adf82f384a6a-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.007877 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d9b017e-f89a-410c-a6f1-7db0bc934b79-metrics-tls\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.008426 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.012250 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.015240 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.015900 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/29789917-34b7-4bc9-8b49-a00b5055f092-proxy-tls\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.016147 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ff5368c-b6b0-4e04-83db-314de1999000-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.024051 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zjk\" (UniqueName: \"kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk\") pod \"marketplace-operator-79b997595-7jxz9\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.027199 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8pkq\" (UniqueName: \"kubernetes.io/projected/b6442e1c-e629-43df-b0ad-adf82f384a6a-kube-api-access-h8pkq\") pod \"catalog-operator-68c6474976-6wchs\" (UID: \"b6442e1c-e629-43df-b0ad-adf82f384a6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.031842 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.038438 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfcbn" event={"ID":"d50867fd-81e4-416d-a112-a84b175be026","Type":"ContainerStarted","Data":"ec7bf518be393b69452b3bd54368cea6821110058b35d952d826e23e4d1a4c6e"} Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.050114 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc7jg\" (UniqueName: \"kubernetes.io/projected/6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce-kube-api-access-tc7jg\") pod \"control-plane-machine-set-operator-78cbb6b69f-6mxph\" (UID: \"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.060283 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssl5j\" (UniqueName: \"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-kube-api-access-ssl5j\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.073587 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb"] Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.074701 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rw728"] Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.075451 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzq94\" (UniqueName: \"kubernetes.io/projected/28e6aa1b-b20a-4017-9659-a36c70e1484b-kube-api-access-gzq94\") pod \"service-ca-operator-777779d784-7jh2g\" 
(UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.075747 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-node-bootstrap-token\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.075819 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-certs\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.075888 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5jh\" (UniqueName: \"kubernetes.io/projected/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-kube-api-access-bh5jh\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076094 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-socket-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076183 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-plugins-dir\") pod 
\"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076253 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e6aa1b-b20a-4017-9659-a36c70e1484b-serving-cert\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076320 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-key\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076436 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076531 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e6aa1b-b20a-4017-9659-a36c70e1484b-config\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076613 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076701 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-apiservice-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076785 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aff95854-e88b-463e-af90-ed33ab4c24bf-metrics-tls\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076864 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpsm5\" (UniqueName: \"kubernetes.io/projected/9d8cedf4-9075-4715-9248-53ad5d391a28-kube-api-access-kpsm5\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.076947 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tsp\" (UniqueName: \"kubernetes.io/projected/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-kube-api-access-n2tsp\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.077020 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-cabundle\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.077102 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-registration-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.077174 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nflz\" (UniqueName: \"kubernetes.io/projected/aff95854-e88b-463e-af90-ed33ab4c24bf-kube-api-access-5nflz\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.077249 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-webhook-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.077514 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dcp\" (UniqueName: \"kubernetes.io/projected/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-kube-api-access-d2dcp\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099525 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099789 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-csi-data-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.099940 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.599927939 +0000 UTC m=+136.983930240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099975 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5vl\" (UniqueName: \"kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100019 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rzwn\" (UniqueName: \"kubernetes.io/projected/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-kube-api-access-9rzwn\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100059 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-cert\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100097 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9d8cedf4-9075-4715-9248-53ad5d391a28-tmpfs\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100121 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aff95854-e88b-463e-af90-ed33ab4c24bf-config-volume\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100143 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-mountpoint-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.087279 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e6aa1b-b20a-4017-9659-a36c70e1484b-config\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099922 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-csi-data-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.087330 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-plugins-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 
13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099207 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-certs\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100629 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9d8cedf4-9075-4715-9248-53ad5d391a28-tmpfs\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.086401 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-socket-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.100684 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-mountpoint-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.099245 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" event={"ID":"10b0a93a-9b8f-48d1-bdc2-defe765f1fab","Type":"ContainerStarted","Data":"f0f2c6588f3a2a05b12696f861aa7ba04e113623696f72c0219b7898e5410d23"} Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.089030 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-registration-dir\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.088992 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-cabundle\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.101395 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-apiservice-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.095063 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.102476 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aff95854-e88b-463e-af90-ed33ab4c24bf-config-volume\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.110102 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-cert\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.117224 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nldp8"] Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.123405 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwgwj\" (UniqueName: \"kubernetes.io/projected/d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e-kube-api-access-fwgwj\") pod \"olm-operator-6b444d44fb-f8s8f\" (UID: \"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.124812 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qmnm\" (UniqueName: \"kubernetes.io/projected/fd49f3ee-bb75-4a88-9b5d-207de98a4d0b-kube-api-access-8qmnm\") pod \"etcd-operator-b45778765-fg254\" (UID: \"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.131019 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-signing-key\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.131803 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-node-bootstrap-token\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " 
pod="openshift-machine-config-operator/machine-config-server-xz7g4"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.132673 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28e6aa1b-b20a-4017-9659-a36c70e1484b-serving-cert\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.133180 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.136060 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d8cedf4-9075-4715-9248-53ad5d391a28-webhook-cert\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.136476 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aff95854-e88b-463e-af90-ed33ab4c24bf-metrics-tls\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.136842 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" event={"ID":"836a4387-b928-437f-a758-289ece3ff594","Type":"ContainerStarted","Data":"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.136876 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" event={"ID":"836a4387-b928-437f-a758-289ece3ff594","Type":"ContainerStarted","Data":"fdac9b475d4bfa00edf922ecf6613e603d39a8446fe99913f7198d7278067bcc"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.138141 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn"
Sep 30 13:41:26 crc kubenswrapper[4936]: W0930 13:41:26.138521 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e3e6c9_3a6f_4f01_8fd6_8801b73f2b30.slice/crio-9ff8e966653fbb6c929714c2ad52458acd7ff3f0b9060c5f3a569e3674e9bae1 WatchSource:0}: Error finding container 9ff8e966653fbb6c929714c2ad52458acd7ff3f0b9060c5f3a569e3674e9bae1: Status 404 returned error can't find the container with id 9ff8e966653fbb6c929714c2ad52458acd7ff3f0b9060c5f3a569e3674e9bae1
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.139383 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfqvn\" (UniqueName: \"kubernetes.io/projected/28f20487-8292-428a-b17e-2bb85ad6a0d6-kube-api-access-tfqvn\") pod \"machine-config-controller-84d6567774-c49vt\" (UID: \"28f20487-8292-428a-b17e-2bb85ad6a0d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.141411 4936 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-djmjn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.141449 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" podUID="836a4387-b928-437f-a758-289ece3ff594" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.148081 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.149011 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54105c99-d0f2-4653-b0b5-e3af3b8a118a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-29b8m\" (UID: \"54105c99-d0f2-4653-b0b5-e3af3b8a118a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.160247 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" event={"ID":"b58ad99c-bbed-4e80-9cc1-f281c2072fbf","Type":"ContainerStarted","Data":"703f2d9448759efcd75cda5de1fb30a5a14cd0b9689a04ff9298e198d297f077"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.175908 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" event={"ID":"63461350-882b-40f6-8651-d6273f3e5a2d","Type":"ContainerStarted","Data":"c71cb87e75564c1f8f101247e9a4c231c39b8934330187470191687a4047d844"}
Sep 30 13:41:26 crc kubenswrapper[4936]: W0930 13:41:26.180500 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2667a269_9771_4873_8ed1_6781e6aab9bf.slice/crio-65a70b0cc47d37fc041f88e03ca7c5aa28d9311edd1c55400306a29afb93634c WatchSource:0}: Error finding container 65a70b0cc47d37fc041f88e03ca7c5aa28d9311edd1c55400306a29afb93634c: Status 404 returned error can't find the container with id 65a70b0cc47d37fc041f88e03ca7c5aa28d9311edd1c55400306a29afb93634c
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.182238 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" event={"ID":"0c8003ab-2870-41d5-a1c7-30dd4232d184","Type":"ContainerStarted","Data":"e1632bde4e125a1e68737ada0a4eee5143b8a178860442260a56e7d16f91c492"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.186149 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a47be60-1d82-451e-a0d4-7dfa3fcf3326-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-cqd7b\" (UID: \"8a47be60-1d82-451e-a0d4-7dfa3fcf3326\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.189093 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvxwl\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.189990 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" event={"ID":"50b3ddf6-dac3-4108-a740-978dcd73ef6a","Type":"ContainerStarted","Data":"7220cd659430d48e5eef2401c1116a94e4987b7f227c8d5f793fd3c88904ee58"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.190027 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" event={"ID":"50b3ddf6-dac3-4108-a740-978dcd73ef6a","Type":"ContainerStarted","Data":"6c81da632ff1fef531afa4ed89d56f875782cda0d85ffcc0d23511db55d46bbf"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.199955 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.200660 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.200842 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.700826982 +0000 UTC m=+137.084829283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.200912 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.201189 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.701182823 +0000 UTC m=+137.085185114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.205849 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jl85m" event={"ID":"c7e5e231-b700-4151-81c8-111a3af3bfc2","Type":"ContainerStarted","Data":"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.205974 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jl85m" event={"ID":"c7e5e231-b700-4151-81c8-111a3af3bfc2","Type":"ContainerStarted","Data":"a4068dfc77dfcc96ad8374b6c6fefb3e7dc3264c1810da086406c939573573dc"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.206324 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.208593 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.220094 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ff5368c-b6b0-4e04-83db-314de1999000-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-75ftv\" (UID: \"6ff5368c-b6b0-4e04-83db-314de1999000\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.225637 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fg254"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.241411 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2nljs"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.255438 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sht9l" event={"ID":"2ea3912a-d1e4-4c08-81d1-3788ce6096e6","Type":"ContainerStarted","Data":"eec6d7d68b207f701685730e63e34f86891af561fd6a91b0fc9d352b1af4c4fc"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.256182 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.262862 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwdd\" (UniqueName: \"kubernetes.io/projected/a231f851-06a5-4424-8631-04c71f2ccf7a-kube-api-access-hcwdd\") pod \"package-server-manager-789f6589d5-v22qc\" (UID: \"a231f851-06a5-4424-8631-04c71f2ccf7a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.263539 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" event={"ID":"a641ed11-580d-41ac-967c-e145d80b03fa","Type":"ContainerStarted","Data":"a2830146cdb503e33d90a62d423e0aa6426958233820170695da7701f4e2129b"}
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.268645 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.275890 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.282627 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.296120 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d9b017e-f89a-410c-a6f1-7db0bc934b79-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g69gg\" (UID: \"3d9b017e-f89a-410c-a6f1-7db0bc934b79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.301481 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.302551 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.802536239 +0000 UTC m=+137.186538540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.303986 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtxh\" (UniqueName: \"kubernetes.io/projected/29789917-34b7-4bc9-8b49-a00b5055f092-kube-api-access-wqtxh\") pod \"machine-config-operator-74547568cd-bdtws\" (UID: \"29789917-34b7-4bc9-8b49-a00b5055f092\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.307062 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.313589 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.320282 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.322161 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzq94\" (UniqueName: \"kubernetes.io/projected/28e6aa1b-b20a-4017-9659-a36c70e1484b-kube-api-access-gzq94\") pod \"service-ca-operator-777779d784-7jh2g\" (UID: \"28e6aa1b-b20a-4017-9659-a36c70e1484b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.342808 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tsp\" (UniqueName: \"kubernetes.io/projected/6a0172e4-1900-4476-bc0d-b2bf5cd281e5-kube-api-access-n2tsp\") pod \"ingress-canary-plg5s\" (UID: \"6a0172e4-1900-4476-bc0d-b2bf5cd281e5\") " pod="openshift-ingress-canary/ingress-canary-plg5s"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.343269 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.350644 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.355106 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.357871 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dcp\" (UniqueName: \"kubernetes.io/projected/0e7e66ba-3c9e-476f-9bfc-d4f11d95d195-kube-api-access-d2dcp\") pod \"service-ca-9c57cc56f-n8p2c\" (UID: \"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195\") " pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.373620 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-plg5s"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.384969 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nflz\" (UniqueName: \"kubernetes.io/projected/aff95854-e88b-463e-af90-ed33ab4c24bf-kube-api-access-5nflz\") pod \"dns-default-bjbnj\" (UID: \"aff95854-e88b-463e-af90-ed33ab4c24bf\") " pod="openshift-dns/dns-default-bjbnj"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.403594 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.403845 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:26.903834474 +0000 UTC m=+137.287836775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.412058 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5jh\" (UniqueName: \"kubernetes.io/projected/2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef-kube-api-access-bh5jh\") pod \"machine-config-server-xz7g4\" (UID: \"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef\") " pod="openshift-machine-config-operator/machine-config-server-xz7g4"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.417609 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bjbnj"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.437833 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpsm5\" (UniqueName: \"kubernetes.io/projected/9d8cedf4-9075-4715-9248-53ad5d391a28-kube-api-access-kpsm5\") pod \"packageserver-d55dfcdfc-hbzpb\" (UID: \"9d8cedf4-9075-4715-9248-53ad5d391a28\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.441104 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6rwjr"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.452692 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5vl\" (UniqueName: \"kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl\") pod \"collect-profiles-29320650-cqhgz\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.484216 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.515914 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rzwn\" (UniqueName: \"kubernetes.io/projected/80b740c7-e07f-4ca7-be4e-ef5825f6eb24-kube-api-access-9rzwn\") pod \"csi-hostpathplugin-mqxlc\" (UID: \"80b740c7-e07f-4ca7-be4e-ef5825f6eb24\") " pod="hostpath-provisioner/csi-hostpathplugin-mqxlc"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.516784 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.517023 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.017011868 +0000 UTC m=+137.401014169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.562157 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.590234 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.624158 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.624446 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.124434432 +0000 UTC m=+137.508436733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.639644 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-sht9l"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.639848 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.642634 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.642850 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.658106 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.665001 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.675995 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs"]
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.681586 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xz7g4"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.707721 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc"
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.724776 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.726211 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.226193911 +0000 UTC m=+137.610196212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: W0930 13:41:26.808638 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6442e1c_e629_43df_b0ad_adf82f384a6a.slice/crio-8e14d85da7701e07c114a8e98eb4272ff790112f473cdf8d203cf08ea4c1bb2e WatchSource:0}: Error finding container 8e14d85da7701e07c114a8e98eb4272ff790112f473cdf8d203cf08ea4c1bb2e: Status 404 returned error can't find the container with id 8e14d85da7701e07c114a8e98eb4272ff790112f473cdf8d203cf08ea4c1bb2e
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.848664 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.849030 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.349017488 +0000 UTC m=+137.733019789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:26 crc kubenswrapper[4936]: I0930 13:41:26.951614 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:26 crc kubenswrapper[4936]: E0930 13:41:26.952222 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.452206308 +0000 UTC m=+137.836208609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.053771 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.054277 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.554262055 +0000 UTC m=+137.938264356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.148978 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws"]
Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.154934 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.155163 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.655148488 +0000 UTC m=+138.039150789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.183980 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph"]
Sep 30 13:41:27 crc kubenswrapper[4936]: W0930 13:41:27.250358 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3dd5b8_f86f_48c8_9be1_ab8bf2dfc9ef.slice/crio-59b8dc67839a45a4a3c1e3aa24b1fc66980429a5c14e31a9a8c6edc1d28cc948 WatchSource:0}: Error finding container 59b8dc67839a45a4a3c1e3aa24b1fc66980429a5c14e31a9a8c6edc1d28cc948: Status 404 returned error can't find the container with id 59b8dc67839a45a4a3c1e3aa24b1fc66980429a5c14e31a9a8c6edc1d28cc948
Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.255981 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.256268 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.756256436 +0000 UTC m=+138.140258727 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.289110 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jl85m" podStartSLOduration=117.28909493 podStartE2EDuration="1m57.28909493s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:27.285840265 +0000 UTC m=+137.669842566" watchObservedRunningTime="2025-09-30 13:41:27.28909493 +0000 UTC m=+137.673097221" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.289872 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fg254"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.346642 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" event={"ID":"1664a2e3-61da-4eea-bdba-06b422cbb9b6","Type":"ContainerStarted","Data":"2645fe0680a18140b07bc1a7799402679f4077965094948809af312808f7aff8"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.359610 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.359837 
4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.859812367 +0000 UTC m=+138.243814668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.360128 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.360512 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.860505367 +0000 UTC m=+138.244507668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.366623 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sht9l" event={"ID":"2ea3912a-d1e4-4c08-81d1-3788ce6096e6","Type":"ContainerStarted","Data":"c389da011eee61294e2b6ab7770fe38c1f925903c62b66f9dc57fe64e320efdd"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.439404 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" podStartSLOduration=117.439385834 podStartE2EDuration="1m57.439385834s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:27.406514308 +0000 UTC m=+137.790516619" watchObservedRunningTime="2025-09-30 13:41:27.439385834 +0000 UTC m=+137.823388135" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.461305 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.461716 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:27.961700679 +0000 UTC m=+138.345702980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.521897 4936 generic.go:334] "Generic (PLEG): container finished" podID="a641ed11-580d-41ac-967c-e145d80b03fa" containerID="ec79a5bffdc82100fd9a1075f23af0d4cf8d6332dba01bde8ef2e3af70e79244" exitCode=0 Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.522255 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" event={"ID":"a641ed11-580d-41ac-967c-e145d80b03fa","Type":"ContainerDied","Data":"ec79a5bffdc82100fd9a1075f23af0d4cf8d6332dba01bde8ef2e3af70e79244"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.538907 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.547696 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" event={"ID":"33d9470f-af56-43d0-9d14-90930c95615a","Type":"ContainerStarted","Data":"1295657ffc8cca3ca464fb07e8fe0e11545a9d6cf00fed1941562b4fc358a6af"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.592168 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.595254 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" event={"ID":"9881cb2e-950d-4550-b04a-8254d9581cd1","Type":"ContainerStarted","Data":"1e6782d69039745553e65c14b5653846d231c46feea28d3fb9c53d64448b033f"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.598895 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" event={"ID":"9881cb2e-950d-4550-b04a-8254d9581cd1","Type":"ContainerStarted","Data":"fee0fe8a7252b21af47f0dacc8898a3453b9395689e06898e9bae31dd359a5bd"} Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.597273 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.09725331 +0000 UTC m=+138.481255611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.609520 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.618822 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.627800 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.631743 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" event={"ID":"b6442e1c-e629-43df-b0ad-adf82f384a6a","Type":"ContainerStarted","Data":"8e14d85da7701e07c114a8e98eb4272ff790112f473cdf8d203cf08ea4c1bb2e"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.638422 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.664476 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:27 crc kubenswrapper[4936]: [-]has-synced failed: reason 
withheld Sep 30 13:41:27 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:27 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.664516 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.671894 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.676745 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.694155 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" event={"ID":"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30","Type":"ContainerStarted","Data":"4216f1bb84d7a28cd4674baf7d17a976186ed86ec2f56f446667afec38dde9ff"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.694197 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" event={"ID":"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30","Type":"ContainerStarted","Data":"9ff8e966653fbb6c929714c2ad52458acd7ff3f0b9060c5f3a569e3674e9bae1"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.701574 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.701912 4936 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.201896713 +0000 UTC m=+138.585899014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.703754 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" event={"ID":"eb040b17-af49-45f7-9405-2f75256dbbe4","Type":"ContainerStarted","Data":"53b24b447418514d9b38d1387cc76665494f52b324c28afa6815390ac56190bf"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.703789 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" event={"ID":"eb040b17-af49-45f7-9405-2f75256dbbe4","Type":"ContainerStarted","Data":"8492a7eb53f0a4e9d497384700f2501fe85663079c00b4103e2528453061c578"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.726015 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" event={"ID":"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474","Type":"ContainerStarted","Data":"a24b3c4a04fd0087e184dc52656be4bc387af267fc604358c527bac075d6817b"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.726053 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" event={"ID":"2cf98c2b-d04d-4bb7-bf3b-15bdf2c0e474","Type":"ContainerStarted","Data":"2321c2c1d71fee7488630e2344419ccb83fa881a34d54a50d66c80d06194c7e5"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.742613 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" event={"ID":"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df","Type":"ContainerStarted","Data":"7c7012cc0dd08ad9666a1908852b2e10c6919527a37c6e85f6ba25368d3eeb1a"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.744218 4936 generic.go:334] "Generic (PLEG): container finished" podID="0c8003ab-2870-41d5-a1c7-30dd4232d184" containerID="8782bb9d6931e514f48e9db3fc3457d94aa029de62cdfdbe17f131a7b9cc52ba" exitCode=0 Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.744414 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" event={"ID":"0c8003ab-2870-41d5-a1c7-30dd4232d184","Type":"ContainerDied","Data":"8782bb9d6931e514f48e9db3fc3457d94aa029de62cdfdbe17f131a7b9cc52ba"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.777046 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" event={"ID":"50b3ddf6-dac3-4108-a740-978dcd73ef6a","Type":"ContainerStarted","Data":"0ecd2a779cf7541bf9fcf98129b29e5719e42f82c1634f1c677177b6e41f9f2e"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.788191 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" event={"ID":"e16bbb17-ee43-4b34-885f-0e042fcde913","Type":"ContainerStarted","Data":"7c856fc640e0c5221ca2053cd5d5de1d6b66b2eb80e114bc3453b68557b7f1bc"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.789100 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" event={"ID":"2667a269-9771-4873-8ed1-6781e6aab9bf","Type":"ContainerStarted","Data":"65a70b0cc47d37fc041f88e03ca7c5aa28d9311edd1c55400306a29afb93634c"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.789844 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xz7g4" event={"ID":"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef","Type":"ContainerStarted","Data":"59b8dc67839a45a4a3c1e3aa24b1fc66980429a5c14e31a9a8c6edc1d28cc948"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.800693 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-plg5s"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.809841 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" event={"ID":"10b0a93a-9b8f-48d1-bdc2-defe765f1fab","Type":"ContainerStarted","Data":"70a2488e25161402e14a1d7361ceb3d7e9bfc158e34a49244bfa96b76f16fb42"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.810900 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.811233 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.311221523 +0000 UTC m=+138.695223824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.849351 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" event={"ID":"63461350-882b-40f6-8651-d6273f3e5a2d","Type":"ContainerStarted","Data":"65397e110eea089fdaabbcc7e5a6b27d505e968c9792226b62c0a2fd67c30c4c"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.873072 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" event={"ID":"b58ad99c-bbed-4e80-9cc1-f281c2072fbf","Type":"ContainerStarted","Data":"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.874149 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.905909 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nfcbn" event={"ID":"d50867fd-81e4-416d-a112-a84b175be026","Type":"ContainerStarted","Data":"9327633c09b52a71ffc20d11dcd9113e68afa7b4302c14d85626b090207cf066"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.906724 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.911986 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.912191 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.412140757 +0000 UTC m=+138.796143058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.912376 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:27 crc kubenswrapper[4936]: E0930 13:41:27.917111 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.417095043 +0000 UTC m=+138.801097344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.918348 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-sht9l" podStartSLOduration=117.918318618 podStartE2EDuration="1m57.918318618s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:27.91770059 +0000 UTC m=+138.301702891" watchObservedRunningTime="2025-09-30 13:41:27.918318618 +0000 UTC m=+138.302320919" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.923498 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.923545 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfcbn" podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.933586 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g"] Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.977106 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" event={"ID":"c2ccd121-a5fc-4f11-b256-70b80420eb21","Type":"ContainerStarted","Data":"741ea0c2314bbd37d322c8a81b0fde35c4534f4034186c8549c4dc846c25114a"} Sep 30 13:41:27 crc kubenswrapper[4936]: I0930 13:41:27.991726 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bjbnj"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:27.999870 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" event={"ID":"29789917-34b7-4bc9-8b49-a00b5055f092","Type":"ContainerStarted","Data":"6456949c3abd7da574daf9de1744ac019643b71aa2b7c1951deec706c5fe59ba"} Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.013191 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.013383 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.51336819 +0000 UTC m=+138.897370491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.013779 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.014956 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.514929786 +0000 UTC m=+138.898932087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.027211 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" event={"ID":"06045f3a-af69-49c7-9759-915cd9fb4c65","Type":"ContainerStarted","Data":"4a2a1c7048ba45b8b9179f95dee2de30ea3cfb74c638d2bcb88b740b46f15575"} Sep 30 13:41:28 crc kubenswrapper[4936]: W0930 13:41:28.046437 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e6aa1b_b20a_4017_9659_a36c70e1484b.slice/crio-b4a03fc578470554c4f6ee6b6913bba5d12d66c33fbb2de741459a0b73c7e5d0 WatchSource:0}: Error finding container b4a03fc578470554c4f6ee6b6913bba5d12d66c33fbb2de741459a0b73c7e5d0: Status 404 returned error can't find the container with id b4a03fc578470554c4f6ee6b6913bba5d12d66c33fbb2de741459a0b73c7e5d0 Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.087849 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.114965 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.115173 4936 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.615150939 +0000 UTC m=+138.999153240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.115419 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.115984 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.615775137 +0000 UTC m=+138.999777428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.119746 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" podStartSLOduration=118.119729303 podStartE2EDuration="1m58.119729303s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.086965201 +0000 UTC m=+138.470967502" watchObservedRunningTime="2025-09-30 13:41:28.119729303 +0000 UTC m=+138.503731604" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.171436 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gc5z6" podStartSLOduration=118.171411921 podStartE2EDuration="1m58.171411921s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.169204226 +0000 UTC m=+138.553206527" watchObservedRunningTime="2025-09-30 13:41:28.171411921 +0000 UTC m=+138.555414222" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.182019 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.217024 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.217918 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.717905426 +0000 UTC m=+139.101907717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.219936 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rbjr5" podStartSLOduration=119.219918136 podStartE2EDuration="1m59.219918136s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.203452812 +0000 UTC m=+138.587455123" watchObservedRunningTime="2025-09-30 13:41:28.219918136 +0000 UTC m=+138.603920437" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.238296 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.295488 
4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.322983 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.323257 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.82324582 +0000 UTC m=+139.207248121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.423981 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.424195 4936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.924173874 +0000 UTC m=+139.308176175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.424542 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.424849 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:28.924837803 +0000 UTC m=+139.308840104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.444043 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cbz9v" podStartSLOduration=118.444015287 podStartE2EDuration="1m58.444015287s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.381024947 +0000 UTC m=+138.765027248" watchObservedRunningTime="2025-09-30 13:41:28.444015287 +0000 UTC m=+138.828017588" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.502794 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xm8pb" podStartSLOduration=119.502775992 podStartE2EDuration="1m59.502775992s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.444910633 +0000 UTC m=+138.828912934" watchObservedRunningTime="2025-09-30 13:41:28.502775992 +0000 UTC m=+138.886778283" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.504353 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-n8p2c"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.525227 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.525606 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.025579542 +0000 UTC m=+139.409581843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.547924 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mqxlc"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.550887 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb"] Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.560938 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nfcbn" podStartSLOduration=118.56091456 podStartE2EDuration="1m58.56091456s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:28.541161659 +0000 UTC m=+138.925163970" watchObservedRunningTime="2025-09-30 
13:41:28.56091456 +0000 UTC m=+138.944916861" Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.627351 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.627675 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.12765922 +0000 UTC m=+139.511661521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.644214 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:28 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:28 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:28 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.644262 4936 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:28 crc kubenswrapper[4936]: W0930 13:41:28.679086 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8cedf4_9075_4715_9248_53ad5d391a28.slice/crio-a6aa227873b68b4ae7b085a1349ec74ad7db18cb459607b20b6fe6e58130f437 WatchSource:0}: Error finding container a6aa227873b68b4ae7b085a1349ec74ad7db18cb459607b20b6fe6e58130f437: Status 404 returned error can't find the container with id a6aa227873b68b4ae7b085a1349ec74ad7db18cb459607b20b6fe6e58130f437 Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.727885 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.728205 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.228185482 +0000 UTC m=+139.612187783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.830168 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.830506 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.330495066 +0000 UTC m=+139.714497367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.930808 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.931069 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.431040739 +0000 UTC m=+139.815043040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:28 crc kubenswrapper[4936]: I0930 13:41:28.931315 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:28 crc kubenswrapper[4936]: E0930 13:41:28.931641 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.431628536 +0000 UTC m=+139.815630837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.040578 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.041046 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.541030489 +0000 UTC m=+139.925032790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.095544 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" event={"ID":"54105c99-d0f2-4653-b0b5-e3af3b8a118a","Type":"ContainerStarted","Data":"7d5a989b019976080364e971e148fe6d514e544ad828121de5296807df95ebcf"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.126930 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" event={"ID":"8a47be60-1d82-451e-a0d4-7dfa3fcf3326","Type":"ContainerStarted","Data":"e332e62a2ccef7667dfdebc0c045eba6dd101789a8c97762357bc2f4a50de2d4"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.126991 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" event={"ID":"8a47be60-1d82-451e-a0d4-7dfa3fcf3326","Type":"ContainerStarted","Data":"54905316bb8655391303256c2162ea035a7baf053d7ac8783bc592063f4c8964"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.136299 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-plg5s" event={"ID":"6a0172e4-1900-4476-bc0d-b2bf5cd281e5","Type":"ContainerStarted","Data":"81ad5c97b9d2603b38248cac004f13d4eee0d93f8ba660d8146f28912a6a119f"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.144726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.145882 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.645866848 +0000 UTC m=+140.029869149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.157911 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" event={"ID":"28f20487-8292-428a-b17e-2bb85ad6a0d6","Type":"ContainerStarted","Data":"ebcf19ed356d36d5c56b0ddcfe4e3295a82379fcc4f0b4a91cc0a12553f8fd9e"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.171726 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-cqd7b" podStartSLOduration=119.171705426 podStartE2EDuration="1m59.171705426s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.168740459 +0000 UTC 
m=+139.552742760" watchObservedRunningTime="2025-09-30 13:41:29.171705426 +0000 UTC m=+139.555707727" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.247452 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.248355 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.748321116 +0000 UTC m=+140.132323417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.248610 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.249021 4936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.749007857 +0000 UTC m=+140.133010158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.261368 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" event={"ID":"7cd84bc0-8b2a-45a0-bda3-e1cad2d795df","Type":"ContainerStarted","Data":"6fb35db9032e66b8986807bfb8ddfd1f9b888368ff2d41ee500a7a7e9d6addd8"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.296987 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2nljs" podStartSLOduration=120.296974885 podStartE2EDuration="2m0.296974885s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.295902434 +0000 UTC m=+139.679904735" watchObservedRunningTime="2025-09-30 13:41:29.296974885 +0000 UTC m=+139.680977186" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.350603 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" 
event={"ID":"9881cb2e-950d-4550-b04a-8254d9581cd1","Type":"ContainerStarted","Data":"72b163fbfe27135eae372a59888dbd90affe1d7c96689aec13b1d39cac41d17a"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.354109 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.358695 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.858671977 +0000 UTC m=+140.242674278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.388945 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wd2th" podStartSLOduration=119.388928656 podStartE2EDuration="1m59.388928656s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.374290006 +0000 UTC m=+139.758292307" watchObservedRunningTime="2025-09-30 13:41:29.388928656 +0000 UTC m=+139.772930957" Sep 
30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.422310 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" event={"ID":"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce","Type":"ContainerStarted","Data":"6d6c39ed4ee5c5b790d5588c968209f0f89e45074d558b203943e4ca15c93042"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.422375 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" event={"ID":"6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce","Type":"ContainerStarted","Data":"9145be137d3b05585e8609f6dcaf94dfd6d16ead4224ea6fbf20438d8373cbab"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.465697 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.467407 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:29.96739326 +0000 UTC m=+140.351395551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.495079 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" event={"ID":"0c8003ab-2870-41d5-a1c7-30dd4232d184","Type":"ContainerStarted","Data":"1a113cb57772384cacf24f6a88924dd28c88fe391a411d18ce69bd6c1bc50e4d"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.495968 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.533743 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" podStartSLOduration=119.533727208 podStartE2EDuration="1m59.533727208s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.532826971 +0000 UTC m=+139.916829282" watchObservedRunningTime="2025-09-30 13:41:29.533727208 +0000 UTC m=+139.917729509" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.535234 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6mxph" podStartSLOduration=119.535228132 podStartE2EDuration="1m59.535228132s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.456597343 +0000 UTC m=+139.840599644" watchObservedRunningTime="2025-09-30 13:41:29.535228132 +0000 UTC m=+139.919230433" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.567070 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.570755 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xz7g4" event={"ID":"2e3dd5b8-f86f-48c8-9be1-ab8bf2dfc9ef","Type":"ContainerStarted","Data":"2825168fc88508158b346d5fa8fc627449f67cf77040f0ee56f2eb7cddada464"} Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.571817 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.071798736 +0000 UTC m=+140.455801027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.600294 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xz7g4" podStartSLOduration=6.600276342 podStartE2EDuration="6.600276342s" podCreationTimestamp="2025-09-30 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.598774928 +0000 UTC m=+139.982777239" watchObservedRunningTime="2025-09-30 13:41:29.600276342 +0000 UTC m=+139.984278643" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.614143 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" event={"ID":"27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30","Type":"ContainerStarted","Data":"c17185a5f8c94c77a1c7bad8a81751d1e0cf6fa16a9cc7a7c8f4b258b6aab45d"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.645488 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" event={"ID":"63461350-882b-40f6-8651-d6273f3e5a2d","Type":"ContainerStarted","Data":"d64e30e499a9ca2e74ab4b951a070c2d9ab36e65c946bb68b449a94dc6f20435"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.659061 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Sep 30 13:41:29 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:29 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:29 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.659098 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.662385 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" event={"ID":"9d8cedf4-9075-4715-9248-53ad5d391a28","Type":"ContainerStarted","Data":"a6aa227873b68b4ae7b085a1349ec74ad7db18cb459607b20b6fe6e58130f437"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.662877 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.663722 4936 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hbzpb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.663768 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" podUID="9d8cedf4-9075-4715-9248-53ad5d391a28" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.672504 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.672867 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.172854504 +0000 UTC m=+140.556856805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.681687 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" event={"ID":"661fcd49-29e9-4299-8fa7-9696bb5d1944","Type":"ContainerStarted","Data":"895a3deb2b4f844a017c4721794d0283798f2f3c0f595ed6afba58ee9201bf9b"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.682740 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-rw728" podStartSLOduration=119.682722083 podStartE2EDuration="1m59.682722083s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.65366423 +0000 UTC m=+140.037666531" 
watchObservedRunningTime="2025-09-30 13:41:29.682722083 +0000 UTC m=+140.066724384" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.683312 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d5kqn" podStartSLOduration=119.683306751 podStartE2EDuration="1m59.683306751s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.682140376 +0000 UTC m=+140.066142677" watchObservedRunningTime="2025-09-30 13:41:29.683306751 +0000 UTC m=+140.067309042" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.718484 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" event={"ID":"33d9470f-af56-43d0-9d14-90930c95615a","Type":"ContainerStarted","Data":"e2732966e725c35cb102e19db25e94baa5e569bde724b5a0e2fc3a949e3c0046"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.741535 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" event={"ID":"c2ccd121-a5fc-4f11-b256-70b80420eb21","Type":"ContainerStarted","Data":"a3f956ea9293d8db47f705f13776ceaadf048c8c892b5f16cd653688ac6eae56"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.742250 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.751764 4936 patch_prober.go:28] interesting pod/console-operator-58897d9998-6rwjr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 
13:41:29.752121 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" podUID="c2ccd121-a5fc-4f11-b256-70b80420eb21" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.753115 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" podStartSLOduration=119.75309492 podStartE2EDuration="1m59.75309492s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.752712219 +0000 UTC m=+140.136714520" watchObservedRunningTime="2025-09-30 13:41:29.75309492 +0000 UTC m=+140.137097211" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.771561 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" event={"ID":"3d9b017e-f89a-410c-a6f1-7db0bc934b79","Type":"ContainerStarted","Data":"cc398c529d9c1c4ea7ef7ff2b382a0989dff2c2ac2f9f61b0770bc0d1cefc214"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.777944 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.779022 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 13:41:30.279007171 +0000 UTC m=+140.663009472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.781290 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" event={"ID":"28e6aa1b-b20a-4017-9659-a36c70e1484b","Type":"ContainerStarted","Data":"b4a03fc578470554c4f6ee6b6913bba5d12d66c33fbb2de741459a0b73c7e5d0"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.798969 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" event={"ID":"a231f851-06a5-4424-8631-04c71f2ccf7a","Type":"ContainerStarted","Data":"488330f6a1c8a9e9088833bcc8549cbbe7bd3a17ddfc09749bfa33c983cfa445"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.799006 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" event={"ID":"a231f851-06a5-4424-8631-04c71f2ccf7a","Type":"ContainerStarted","Data":"90935b186b17a6d5ba2a2d6906f2aea7d26d158a9a8d418fe3a647908f1b635c"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.799265 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mpz7" podStartSLOduration=119.799256776 podStartE2EDuration="1m59.799256776s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.798642368 +0000 UTC m=+140.182644669" watchObservedRunningTime="2025-09-30 13:41:29.799256776 +0000 UTC m=+140.183259077" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.807116 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" event={"ID":"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b","Type":"ContainerStarted","Data":"f90aefb374bcf2f79307347f000977eae3af57bc4d64ac53b52f9cb82ce9698b"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.816523 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" event={"ID":"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195","Type":"ContainerStarted","Data":"cf251736e70096ca635b3d6fde519d04829e350a0f5e7d4c55acff0b988d0881"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.827289 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" podStartSLOduration=119.827271518 podStartE2EDuration="1m59.827271518s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.826065653 +0000 UTC m=+140.210067954" watchObservedRunningTime="2025-09-30 13:41:29.827271518 +0000 UTC m=+140.211273819" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.827310 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" event={"ID":"29789917-34b7-4bc9-8b49-a00b5055f092","Type":"ContainerStarted","Data":"21d6606cc97952d0dcf2d04e30d8ebe3713ade731c8f09b571485952bcb02861"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.837526 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="1664a2e3-61da-4eea-bdba-06b422cbb9b6" containerID="ac7c43b2fea3819ddbb0bed574875f85d84fd382fd490788ce3149592c651196" exitCode=0 Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.837778 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" event={"ID":"1664a2e3-61da-4eea-bdba-06b422cbb9b6","Type":"ContainerDied","Data":"ac7c43b2fea3819ddbb0bed574875f85d84fd382fd490788ce3149592c651196"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.856773 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" event={"ID":"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e","Type":"ContainerStarted","Data":"cc8e637fb87fa5da18715c1b8dfa82fbf0ab3c6db3c0617bcba46298faff5727"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.856823 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" event={"ID":"d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e","Type":"ContainerStarted","Data":"2586a0ea23c2730a304301484cf409eb85589c2c92823e71794c2e9375e82086"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.857135 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.857992 4936 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-f8s8f container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.858027 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" podUID="d9cb20a8-8718-46d4-aaff-2f17eb5e3b2e" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.865059 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" event={"ID":"80b740c7-e07f-4ca7-be4e-ef5825f6eb24","Type":"ContainerStarted","Data":"367ebad67305915ee445bdb9dd950a30ca2cc42567166db11e5f3be7d890c3de"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.881315 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.882074 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" podStartSLOduration=119.882045457 podStartE2EDuration="1m59.882045457s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.852462588 +0000 UTC m=+140.236464889" watchObservedRunningTime="2025-09-30 13:41:29.882045457 +0000 UTC m=+140.266047798" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.882283 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.382272094 +0000 UTC m=+140.766274385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.883231 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" podStartSLOduration=119.883216431 podStartE2EDuration="1m59.883216431s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.878359549 +0000 UTC m=+140.262361850" watchObservedRunningTime="2025-09-30 13:41:29.883216431 +0000 UTC m=+140.267218772" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.891525 4936 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7ccwq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.891582 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.884196 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" 
event={"ID":"06045f3a-af69-49c7-9759-915cd9fb4c65","Type":"ContainerStarted","Data":"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.896455 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.908673 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" event={"ID":"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f","Type":"ContainerStarted","Data":"8daf1693a5932fe145210cd7ec8209aeb27b7b487d521e7fe1388715836e4e26"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.908736 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" event={"ID":"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f","Type":"ContainerStarted","Data":"b2cff9db225280d1f476c5b293dc582ebeec76fb57351f3ee7a7d5650dfd75b2"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.910061 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.910984 4936 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7jxz9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.911035 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Sep 30 13:41:29 crc 
kubenswrapper[4936]: I0930 13:41:29.933049 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" podStartSLOduration=119.933033174 podStartE2EDuration="1m59.933033174s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:29.931254312 +0000 UTC m=+140.315256613" watchObservedRunningTime="2025-09-30 13:41:29.933033174 +0000 UTC m=+140.317035475" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.937917 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" event={"ID":"b6442e1c-e629-43df-b0ad-adf82f384a6a","Type":"ContainerStarted","Data":"da47346eda0f19223ba0ff130b95ad6dc0526d24350cbe9b17f7367012d1bde2"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.938464 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.942344 4936 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6wchs container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.942377 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" podUID="b6442e1c-e629-43df-b0ad-adf82f384a6a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.965131 4936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" event={"ID":"e16bbb17-ee43-4b34-885f-0e042fcde913","Type":"ContainerStarted","Data":"b8a10e3df97cb4509b8ce965ce6e55154135ffc06045396a7174cc933089d6d1"} Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.985260 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.985711 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.48568497 +0000 UTC m=+140.869687271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.986439 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:29 crc kubenswrapper[4936]: E0930 13:41:29.990109 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.49009535 +0000 UTC m=+140.874097651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:29 crc kubenswrapper[4936]: I0930 13:41:29.991356 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" event={"ID":"2667a269-9771-4873-8ed1-6781e6aab9bf","Type":"ContainerStarted","Data":"7c576fe841db2132160b41912affd48f4fc835ee15a8e65391c21120cc6a3065"} Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.035566 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" event={"ID":"6ff5368c-b6b0-4e04-83db-314de1999000","Type":"ContainerStarted","Data":"631227d468cbdc2b7053e4dde29251c6b7587877d6247a001e1153f857bec597"} Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.058517 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjbnj" event={"ID":"aff95854-e88b-463e-af90-ed33ab4c24bf","Type":"ContainerStarted","Data":"e443d4d62bab604d85d4cf13e593afb54ed18edb9999298d77a5a0a4119ec22c"} Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.061431 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.061461 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfcbn" 
podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.087357 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.088521 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.5885055 +0000 UTC m=+140.972507801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.191943 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.192571 4936 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f" podStartSLOduration=120.192551805 podStartE2EDuration="2m0.192551805s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.026558941 +0000 UTC m=+140.410561242" watchObservedRunningTime="2025-09-30 13:41:30.192551805 +0000 UTC m=+140.576554106" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.195452 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.695402179 +0000 UTC m=+141.079404480 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.273229 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs" podStartSLOduration=120.273212544 podStartE2EDuration="2m0.273212544s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.269875756 +0000 UTC m=+140.653878057" watchObservedRunningTime="2025-09-30 13:41:30.273212544 +0000 UTC m=+140.657214845" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.294784 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.294981 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.794956863 +0000 UTC m=+141.178959164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.295104 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.295503 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.795496149 +0000 UTC m=+141.179498450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.396299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.396452 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.896427723 +0000 UTC m=+141.280430024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.396730 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.397072 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.897060571 +0000 UTC m=+141.281062862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.461124 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" podStartSLOduration=121.461106542 podStartE2EDuration="2m1.461106542s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.407114016 +0000 UTC m=+140.791116327" watchObservedRunningTime="2025-09-30 13:41:30.461106542 +0000 UTC m=+140.845108843" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.497893 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.498203 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:30.998187651 +0000 UTC m=+141.382189952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.529149 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" podStartSLOduration=120.529115439 podStartE2EDuration="2m0.529115439s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.486528169 +0000 UTC m=+140.870530470" watchObservedRunningTime="2025-09-30 13:41:30.529115439 +0000 UTC m=+140.913117740" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.592095 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" podStartSLOduration=120.592079578 podStartE2EDuration="2m0.592079578s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.531944632 +0000 UTC m=+140.915946933" watchObservedRunningTime="2025-09-30 13:41:30.592079578 +0000 UTC m=+140.976081879" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.600028 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.600374 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.100359431 +0000 UTC m=+141.484361742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.645376 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:30 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:30 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:30 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.645908 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.701146 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.701351 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.201313946 +0000 UTC m=+141.585316247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.701774 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.702069 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.202061098 +0000 UTC m=+141.586063399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.736221 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" podStartSLOduration=120.736202681 podStartE2EDuration="2m0.736202681s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:30.599794295 +0000 UTC m=+140.983796596" watchObservedRunningTime="2025-09-30 13:41:30.736202681 +0000 UTC m=+141.120204982" Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.803080 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.803264 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.303240918 +0000 UTC m=+141.687243219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.803611 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.803907 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.303898438 +0000 UTC m=+141.687900739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.905195 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.905412 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.405328146 +0000 UTC m=+141.789330447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:30 crc kubenswrapper[4936]: I0930 13:41:30.905744 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:30 crc kubenswrapper[4936]: E0930 13:41:30.906120 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.406096469 +0000 UTC m=+141.790098770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.006441 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.006630 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.50660559 +0000 UTC m=+141.890607881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.007002 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.007444 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.507430745 +0000 UTC m=+141.891433046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.087987 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7jh2g" event={"ID":"28e6aa1b-b20a-4017-9659-a36c70e1484b","Type":"ContainerStarted","Data":"1258e9febaa1c40125adb565f897f54fb5a0c433b37e0eb3bac37845c8607d65"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.107875 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.108213 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.608199974 +0000 UTC m=+141.992202275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.120787 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" event={"ID":"1664a2e3-61da-4eea-bdba-06b422cbb9b6","Type":"ContainerStarted","Data":"07f6f5c150c67c45d5fc182de50e8ed89516d402f256c3d8b4bb258013e02858"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.131806 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" event={"ID":"3d9b017e-f89a-410c-a6f1-7db0bc934b79","Type":"ContainerStarted","Data":"83c1c0f36fb193308bd2a1f8597a977c819d473bb9ec0fc386bbee9648639c7f"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.131849 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" event={"ID":"3d9b017e-f89a-410c-a6f1-7db0bc934b79","Type":"ContainerStarted","Data":"42ff413ac3d8024c32c6d3e65a5aa3852ef88fb184c28aeb8b37fc258cb8435e"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.140636 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" event={"ID":"a231f851-06a5-4424-8631-04c71f2ccf7a","Type":"ContainerStarted","Data":"b16f1c44c88c765ddfcb3626fb5ba7c9b4da6b796865269f2f45bb2545892e55"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.141147 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.151502 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fg254" event={"ID":"fd49f3ee-bb75-4a88-9b5d-207de98a4d0b","Type":"ContainerStarted","Data":"1c80e4252d073c6d15477d47f2e27b5e05bd2bc99921387bd6309dcad0885811"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.157464 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-75ftv" event={"ID":"6ff5368c-b6b0-4e04-83db-314de1999000","Type":"ContainerStarted","Data":"6944c243d5bad510e296f84a4cce5af3122fab0f9f4436ca7d4f83a6a0d76676"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.159765 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" event={"ID":"661fcd49-29e9-4299-8fa7-9696bb5d1944","Type":"ContainerStarted","Data":"227d7785a4d3ba90022b984891dc38dd3c518d3b85c575898a13d368fb20d8e1"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.161443 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-plg5s" event={"ID":"6a0172e4-1900-4476-bc0d-b2bf5cd281e5","Type":"ContainerStarted","Data":"a3c542b0cbf1898616e7bccabbcc4f3662a1a1f902cb7a75d58c4bd3ce8651d8"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.169583 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" event={"ID":"a641ed11-580d-41ac-967c-e145d80b03fa","Type":"ContainerStarted","Data":"be90a8064ddb723013b7dc60d99eca4d5b0b6432e5cb4dd01509dbc873cc1a29"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.169619 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" 
event={"ID":"a641ed11-580d-41ac-967c-e145d80b03fa","Type":"ContainerStarted","Data":"880de8ef718f46cf887191d6ca6f636cdefbb05f7e0d060722eaf254c5892fa6"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.182647 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" event={"ID":"80b740c7-e07f-4ca7-be4e-ef5825f6eb24","Type":"ContainerStarted","Data":"e7f9e402345667407fbabc7b5b4c2413cd69da9b086c0e06d22b417a471f82ee"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.184056 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjbnj" event={"ID":"aff95854-e88b-463e-af90-ed33ab4c24bf","Type":"ContainerStarted","Data":"db437a72db738b65714cf1108d96fbe03ab48b64145814a2f92b416e60c38dff"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.184084 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bjbnj" event={"ID":"aff95854-e88b-463e-af90-ed33ab4c24bf","Type":"ContainerStarted","Data":"ada4387f4a5e96bb0d0f10edf6614640348e29eba521cb3aeaeedffffa4afdec"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.184547 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.190700 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" event={"ID":"e16bbb17-ee43-4b34-885f-0e042fcde913","Type":"ContainerStarted","Data":"3b747831d1342a4544f88b9fe3c0d51a57de20ad94ee932ad32ca225eeb1dedb"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.195268 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" event={"ID":"54105c99-d0f2-4653-b0b5-e3af3b8a118a","Type":"ContainerStarted","Data":"29e9e50a33221caa6fc03a5004e5ba8657bb63391baa4cf737eb99c47e8f07b8"} Sep 30 13:41:31 
crc kubenswrapper[4936]: I0930 13:41:31.199867 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" event={"ID":"28f20487-8292-428a-b17e-2bb85ad6a0d6","Type":"ContainerStarted","Data":"53f701400e7682107930bbb7808ee7e988b6f3dc3b90a10422465edb2603f3b7"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.199893 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" event={"ID":"28f20487-8292-428a-b17e-2bb85ad6a0d6","Type":"ContainerStarted","Data":"77f5d41b42b6176da89ea8ed083e1f9fdd8a9e880be19cef4b9f0eeb91f70181"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.205850 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdtws" event={"ID":"29789917-34b7-4bc9-8b49-a00b5055f092","Type":"ContainerStarted","Data":"2c1c8aeea1f7dca2c8f190b116b3a1ef8e90d19d1b64f79a9cfd363f469e661d"} Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.209483 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.211541 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.711529268 +0000 UTC m=+142.095531569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.218354 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" event={"ID":"9d8cedf4-9075-4715-9248-53ad5d391a28","Type":"ContainerStarted","Data":"8ee11cc229ab16eaa065d98ab8daca05f39a7ea972d64e671a43be7f7f595720"}
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.222846 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nldp8" event={"ID":"2667a269-9771-4873-8ed1-6781e6aab9bf","Type":"ContainerStarted","Data":"b2583a28947aa9830de67d52d6e3be8dad96c0976d1bc9e7eaa9937361a6e917"}
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.235366 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" event={"ID":"0e7e66ba-3c9e-476f-9bfc-d4f11d95d195","Type":"ContainerStarted","Data":"1aa3036a2b43fe63fc240bd0be311a43f38d2a5db45ba0eeca891f885267f1f1"}
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.236750 4936 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7jxz9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.236793 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.238388 4936 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6nvzg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.238522 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" podUID="0c8003ab-2870-41d5-a1c7-30dd4232d184" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.250800 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6wchs"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.263304 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" podStartSLOduration=121.263288178 podStartE2EDuration="2m1.263288178s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.253742908 +0000 UTC m=+141.637745209" watchObservedRunningTime="2025-09-30 13:41:31.263288178 +0000 UTC m=+141.647290479"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.309993 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.310325 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.810282288 +0000 UTC m=+142.194284589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.311222 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.311685 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.811669989 +0000 UTC m=+142.195672290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.370906 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f8s8f"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.378439 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" podStartSLOduration=121.37842545 podStartE2EDuration="2m1.37842545s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.377052259 +0000 UTC m=+141.761054570" watchObservedRunningTime="2025-09-30 13:41:31.37842545 +0000 UTC m=+141.762427741"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.425177 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-29b8m" podStartSLOduration=121.425163742 podStartE2EDuration="2m1.425163742s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.423646868 +0000 UTC m=+141.807649169" watchObservedRunningTime="2025-09-30 13:41:31.425163742 +0000 UTC m=+141.809166043"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.431813 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.432163 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:31.932147927 +0000 UTC m=+142.316150228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.533521 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.534859 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.034843023 +0000 UTC m=+142.418845324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.591741 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c49vt" podStartSLOduration=121.591726004 podStartE2EDuration="2m1.591726004s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.590174928 +0000 UTC m=+141.974177229" watchObservedRunningTime="2025-09-30 13:41:31.591726004 +0000 UTC m=+141.975728305"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.635169 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.635533 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.13551753 +0000 UTC m=+142.519519831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.643185 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 13:41:31 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld
Sep 30 13:41:31 crc kubenswrapper[4936]: [+]process-running ok
Sep 30 13:41:31 crc kubenswrapper[4936]: healthz check failed
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.643594 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.646071 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bjbnj" podStartSLOduration=8.646052469 podStartE2EDuration="8.646052469s" podCreationTimestamp="2025-09-30 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.639829556 +0000 UTC m=+142.023831857" watchObservedRunningTime="2025-09-30 13:41:31.646052469 +0000 UTC m=+142.030054770"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.737074 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.737507 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.237478984 +0000 UTC m=+142.621481285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.764975 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" podStartSLOduration=121.764957931 podStartE2EDuration="2m1.764957931s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.716594941 +0000 UTC m=+142.100597242" watchObservedRunningTime="2025-09-30 13:41:31.764957931 +0000 UTC m=+142.148960232"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.766093 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" podStartSLOduration=122.766087774 podStartE2EDuration="2m2.766087774s" podCreationTimestamp="2025-09-30 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.764471377 +0000 UTC m=+142.148473688" watchObservedRunningTime="2025-09-30 13:41:31.766087774 +0000 UTC m=+142.150090075"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.781399 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g69gg" podStartSLOduration=121.781383943 podStartE2EDuration="2m1.781383943s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.779505178 +0000 UTC m=+142.163507479" watchObservedRunningTime="2025-09-30 13:41:31.781383943 +0000 UTC m=+142.165386244"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.838709 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.838884 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.338859831 +0000 UTC m=+142.722862132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.839200 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.839592 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.339573832 +0000 UTC m=+142.723576133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.940207 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.940394 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.440366262 +0000 UTC m=+142.824368563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.940723 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:31 crc kubenswrapper[4936]: E0930 13:41:31.940968 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.440957229 +0000 UTC m=+142.824959530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.943999 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-n8p2c" podStartSLOduration=121.943982198 podStartE2EDuration="2m1.943982198s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.942192786 +0000 UTC m=+142.326195097" watchObservedRunningTime="2025-09-30 13:41:31.943982198 +0000 UTC m=+142.327984489"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.944573 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-njdd2" podStartSLOduration=121.944565165 podStartE2EDuration="2m1.944565165s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.894257398 +0000 UTC m=+142.278259699" watchObservedRunningTime="2025-09-30 13:41:31.944565165 +0000 UTC m=+142.328567466"
Sep 30 13:41:31 crc kubenswrapper[4936]: I0930 13:41:31.995909 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-plg5s" podStartSLOduration=8.995889803 podStartE2EDuration="8.995889803s" podCreationTimestamp="2025-09-30 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:31.992794162 +0000 UTC m=+142.376796463" watchObservedRunningTime="2025-09-30 13:41:31.995889803 +0000 UTC m=+142.379892104"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.042161 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.042309 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.542286295 +0000 UTC m=+142.926288596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.042631 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.042995 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.542978805 +0000 UTC m=+142.926981106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.143834 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.144260 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.644244809 +0000 UTC m=+143.028247100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.173241 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hsdwj"]
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.174113 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.184241 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.201892 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsdwj"]
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.222488 4936 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-hbzpb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.222564 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" podUID="9d8cedf4-9075-4715-9248-53ad5d391a28" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.234886 4936 patch_prober.go:28] interesting pod/console-operator-58897d9998-6rwjr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.234942 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" podUID="c2ccd121-a5fc-4f11-b256-70b80420eb21" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.236472 4936 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7ccwq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.236530 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.244945 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.245223 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.745211884 +0000 UTC m=+143.129214185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.245317 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.245373 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tmr\" (UniqueName: \"kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.245395 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.304621 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" event={"ID":"80b740c7-e07f-4ca7-be4e-ef5825f6eb24","Type":"ContainerStarted","Data":"5f3497a4e8fe0d9fd4e8d44fc0b81c1b987232550632eba3136b74e5ef8e235d"}
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.308603 4936 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7jxz9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.308665 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.353263 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.353578 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tmr\" (UniqueName: \"kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.353605 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.353726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.354105 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.354177 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.854159504 +0000 UTC m=+143.238161805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.354726 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.393035 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"]
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.393993 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6dxk"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.396906 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"]
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.401289 4936 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6nvzg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.401429 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" podUID="0c8003ab-2870-41d5-a1c7-30dd4232d184" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.402431 4936 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6nvzg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.402544 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" podUID="0c8003ab-2870-41d5-a1c7-30dd4232d184" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while
awaiting headers)" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.426786 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.456305 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.461872 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:32.961856536 +0000 UTC m=+143.345858837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.488111 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tmr\" (UniqueName: \"kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr\") pod \"community-operators-hsdwj\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.502141 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.545168 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.546090 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.558792 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.559124 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc486\" (UniqueName: \"kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.559151 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.559170 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content\") pod \"certified-operators-m6dxk\" (UID: 
\"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.559310 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.059295948 +0000 UTC m=+143.443298249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.660991 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661040 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661078 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj4v\" 
(UniqueName: \"kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661104 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc486\" (UniqueName: \"kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661117 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661135 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661150 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661531 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.661601 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.161589572 +0000 UTC m=+143.545591863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.661941 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.669549 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hbzpb" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.669863 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:32 crc 
kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:32 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:32 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.669895 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.745320 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.763416 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.763615 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.763658 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj4v\" (UniqueName: \"kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.763684 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.764107 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.764232 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.264218806 +0000 UTC m=+143.648221107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.764453 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.855538 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc486\" (UniqueName: \"kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486\") pod \"certified-operators-m6dxk\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.864172 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.864735 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.865019 4936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.365004506 +0000 UTC m=+143.749006807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.865084 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.918369 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tj4v\" (UniqueName: \"kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v\") pod \"community-operators-rh7ln\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.928883 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.966073 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.966837 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.966908 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddf4j\" (UniqueName: \"kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:32 crc kubenswrapper[4936]: I0930 13:41:32.966953 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:32 crc kubenswrapper[4936]: E0930 13:41:32.967078 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.467061783 +0000 UTC m=+143.851064084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.016704 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.067706 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.067757 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.067828 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddf4j\" (UniqueName: \"kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.067862 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.068377 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.068672 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.568639106 +0000 UTC m=+143.952641407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.069083 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.168866 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.169266 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.669252671 +0000 UTC m=+144.053254972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.201949 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.257277 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddf4j\" (UniqueName: \"kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j\") pod \"certified-operators-hl59l\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.270406 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.270850 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.770835664 +0000 UTC m=+144.154837965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.304992 4936 patch_prober.go:28] interesting pod/console-operator-58897d9998-6rwjr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.305053 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" podUID="c2ccd121-a5fc-4f11-b256-70b80420eb21" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.349036 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" event={"ID":"80b740c7-e07f-4ca7-be4e-ef5825f6eb24","Type":"ContainerStarted","Data":"f466d508d2073f476d1fa74c15a33f9b1ffb18aa317b0a8e25233523961b2c06"} Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.371217 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.371956 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.871929232 +0000 UTC m=+144.255931533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.375071 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsdwj"] Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.380972 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.382127 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.882107581 +0000 UTC m=+144.266109882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.404453 4936 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6nvzg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.404787 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" podUID="0c8003ab-2870-41d5-a1c7-30dd4232d184" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.483123 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.483284 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.983264682 +0000 UTC m=+144.367266983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.483364 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.483894 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:33.98387758 +0000 UTC m=+144.367879881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.510198 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.584794 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.584931 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.084908687 +0000 UTC m=+144.468910988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.585184 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.585532 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.085524535 +0000 UTC m=+144.469526836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.645753 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:33 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:33 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:33 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.645841 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.686739 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.687311 4936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.187296484 +0000 UTC m=+144.571298785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.790111 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.790472 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.290460273 +0000 UTC m=+144.674462574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.891215 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.891761 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.391720937 +0000 UTC m=+144.775723238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:33 crc kubenswrapper[4936]: I0930 13:41:33.993030 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:33 crc kubenswrapper[4936]: E0930 13:41:33.993419 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.493408413 +0000 UTC m=+144.877410704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.095036 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.095455 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.59543919 +0000 UTC m=+144.979441491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.196856 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.197250 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.697231749 +0000 UTC m=+145.081234100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.269245 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.298603 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.298846 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.798810282 +0000 UTC m=+145.182812583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.299070 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.299466 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.799458611 +0000 UTC m=+145.183460912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: W0930 13:41:34.313770 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882bdf58_1728_423e_94c1_79c8dad7ab55.slice/crio-ac1354579bc6dbda24a29b80345e8d4ecbba636ff1d7836a7f2613836c20a8ab WatchSource:0}: Error finding container ac1354579bc6dbda24a29b80345e8d4ecbba636ff1d7836a7f2613836c20a8ab: Status 404 returned error can't find the container with id ac1354579bc6dbda24a29b80345e8d4ecbba636ff1d7836a7f2613836c20a8ab Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.375054 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerStarted","Data":"ac1354579bc6dbda24a29b80345e8d4ecbba636ff1d7836a7f2613836c20a8ab"} Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.377587 4936 generic.go:334] "Generic (PLEG): container finished" podID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerID="2c23346b2e3a0886ed59e890c9ee85f6d9dfda3cd1941f6adc44d555dae4132f" exitCode=0 Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.377846 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerDied","Data":"2c23346b2e3a0886ed59e890c9ee85f6d9dfda3cd1941f6adc44d555dae4132f"} Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.377899 4936 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerStarted","Data":"4abc791839984b959fd82524c274906aee4821f895a3a3cec4b2312444457021"} Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.379815 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.389966 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" event={"ID":"80b740c7-e07f-4ca7-be4e-ef5825f6eb24","Type":"ContainerStarted","Data":"293dd39647747e6a812851c1c9c947872f4a2ca202f7e036c98bd21461b6c563"} Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.401253 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.401536 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.901519787 +0000 UTC m=+145.285522088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.401673 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.402353 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:34.902325561 +0000 UTC m=+145.286327862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.418769 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6nvzg" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.481290 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.503043 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mqxlc" podStartSLOduration=11.503025458 podStartE2EDuration="11.503025458s" podCreationTimestamp="2025-09-30 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:34.502848803 +0000 UTC m=+144.886851114" watchObservedRunningTime="2025-09-30 13:41:34.503025458 +0000 UTC m=+144.887027749" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.503293 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.504413 4936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.004394718 +0000 UTC m=+145.388397019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.527379 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.528289 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.533266 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 13:41:34 crc kubenswrapper[4936]: W0930 13:41:34.542054 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d411f6_43fa_4ee8_a820_b5f77a3d94ac.slice/crio-67c47c2b56cfe0dd95ad4c644890a2c2a852ac78e82261ba734416593fa8c223 WatchSource:0}: Error finding container 67c47c2b56cfe0dd95ad4c644890a2c2a852ac78e82261ba734416593fa8c223: Status 404 returned error can't find the container with id 67c47c2b56cfe0dd95ad4c644890a2c2a852ac78e82261ba734416593fa8c223 Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.607328 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.607398 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.607472 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content\") pod \"redhat-marketplace-sbjhk\" (UID: 
\"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.607512 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.607837 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.107826306 +0000 UTC m=+145.491828607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.620378 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.644960 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:34 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:34 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:34 crc 
kubenswrapper[4936]: healthz check failed Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.645028 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.709227 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.709408 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.209368498 +0000 UTC m=+145.593370789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.709684 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.709772 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.709904 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.710000 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities\") pod \"redhat-marketplace-sbjhk\" (UID: 
\"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.710292 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.710714 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.210550122 +0000 UTC m=+145.594552423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.710903 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content\") pod \"redhat-marketplace-sbjhk\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.737705 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d\") pod \"redhat-marketplace-sbjhk\" (UID: 
\"ba953910-a090-4e61-88f2-f1718da34ce3\") " pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.810999 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.811301 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.311286921 +0000 UTC m=+145.695289212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.824653 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.873785 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.914072 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:34 crc kubenswrapper[4936]: E0930 13:41:34.914407 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.414394599 +0000 UTC m=+145.798396900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:34 crc kubenswrapper[4936]: W0930 13:41:34.915508 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1141335_cd5a_4799_b931_d1e24f3d0044.slice/crio-0af7359bc3df7f35686b62dde7572475883a453d9933091aef3e1d2f0cb5caf2 WatchSource:0}: Error finding container 0af7359bc3df7f35686b62dde7572475883a453d9933091aef3e1d2f0cb5caf2: Status 404 returned error can't find the container with id 0af7359bc3df7f35686b62dde7572475883a453d9933091aef3e1d2f0cb5caf2 Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.920286 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.921289 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:34 crc kubenswrapper[4936]: I0930 13:41:34.926472 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.015030 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.015253 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gblq6\" (UniqueName: \"kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.015293 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.015366 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content\") pod \"redhat-marketplace-thjx8\" (UID: 
\"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.015497 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.515482757 +0000 UTC m=+145.899485058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.084784 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.085285 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.087019 4936 patch_prober.go:28] interesting pod/console-f9d7485db-jl85m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.087069 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jl85m" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" 
Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.117464 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gblq6\" (UniqueName: \"kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.118161 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.119565 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.121408 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.121800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 
13:41:35.121854 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.122198 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.622188021 +0000 UTC m=+146.006190322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.146674 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gblq6\" (UniqueName: \"kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6\") pod \"redhat-marketplace-thjx8\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.223141 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.223351 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.7233133 +0000 UTC m=+146.107315601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.223520 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.223946 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.723933479 +0000 UTC m=+146.107935780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.231524 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.231571 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfcbn" podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.231915 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.231933 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfcbn" podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.245942 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.314096 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.315062 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.321172 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.324400 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.324706 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.824688537 +0000 UTC m=+146.208690838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.328447 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.402865 4936 generic.go:334] "Generic (PLEG): container finished" podID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerID="9325eeeb8e24420eb1f6273dd5dadcd0ff0fcf014d1bf5960bf1f7e64aba2f13" exitCode=0 Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.403038 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerDied","Data":"9325eeeb8e24420eb1f6273dd5dadcd0ff0fcf014d1bf5960bf1f7e64aba2f13"} Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.403145 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerStarted","Data":"67c47c2b56cfe0dd95ad4c644890a2c2a852ac78e82261ba734416593fa8c223"} Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.408511 4936 generic.go:334] "Generic (PLEG): container finished" podID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerID="e6f2b8a38e21c53aad1076d43c0af24e8b46f526a5a339bd88e128784bc0e96e" exitCode=0 Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.408595 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hl59l" 
event={"ID":"f1141335-cd5a-4799-b931-d1e24f3d0044","Type":"ContainerDied","Data":"e6f2b8a38e21c53aad1076d43c0af24e8b46f526a5a339bd88e128784bc0e96e"} Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.408615 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hl59l" event={"ID":"f1141335-cd5a-4799-b931-d1e24f3d0044","Type":"ContainerStarted","Data":"0af7359bc3df7f35686b62dde7572475883a453d9933091aef3e1d2f0cb5caf2"} Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.419990 4936 generic.go:334] "Generic (PLEG): container finished" podID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerID="1f8f4bdf6e680ca5bd5398bea7897ecd021d429235c3c446efb5aa54e0bd89f9" exitCode=0 Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.420097 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerDied","Data":"1f8f4bdf6e680ca5bd5398bea7897ecd021d429235c3c446efb5aa54e0bd89f9"} Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.426937 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.426975 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.427006 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.427058 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdfll\" (UniqueName: \"kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.427287 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:35.92727199 +0000 UTC m=+146.311274291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.527803 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.528042 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.528081 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.528181 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdfll\" (UniqueName: \"kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " 
pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.528694 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.028679918 +0000 UTC m=+146.412682209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.529563 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.529708 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.529781 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.537255 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.566176 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdfll\" (UniqueName: \"kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll\") pod \"redhat-operators-l7pm4\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.567001 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.579080 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:41:35 crc kubenswrapper[4936]: W0930 13:41:35.603724 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba953910_a090_4e61_88f2_f1718da34ce3.slice/crio-20b7b0a3320999900d861cb1fee1938352eeb80ca7790b8ff512dae76ea2e3d7 WatchSource:0}: Error finding container 20b7b0a3320999900d861cb1fee1938352eeb80ca7790b8ff512dae76ea2e3d7: Status 404 returned error can't find the container with id 20b7b0a3320999900d861cb1fee1938352eeb80ca7790b8ff512dae76ea2e3d7 Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.608893 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.608952 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.610521 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.625010 4936 patch_prober.go:28] 
interesting pod/apiserver-76f77b778f-hj57l container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]log ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]etcd ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/max-in-flight-filter ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 13:41:35 crc kubenswrapper[4936]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 13:41:35 crc kubenswrapper[4936]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/openshift.io-startinformers ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 13:41:35 crc kubenswrapper[4936]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 13:41:35 crc kubenswrapper[4936]: livez check failed Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.625056 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" podUID="a641ed11-580d-41ac-967c-e145d80b03fa" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645173 4936 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645287 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645173 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645712 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645766 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8b7c\" (UniqueName: \"kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.645797 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc 
kubenswrapper[4936]: E0930 13:41:35.646068 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.146056405 +0000 UTC m=+146.530058706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.646785 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:35 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:35 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:35 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.646817 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.709725 4936 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.747078 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.747202 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.247185745 +0000 UTC m=+146.631188046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.747363 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.747429 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.747479 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l8b7c\" (UniqueName: \"kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.747500 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.748769 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.248761401 +0000 UTC m=+146.632763702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.750182 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.750437 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.767927 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8b7c\" (UniqueName: \"kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c\") pod \"redhat-operators-tnh5g\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.848269 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.848446 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.348421268 +0000 UTC m=+146.732423569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.848785 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.849078 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.349067137 +0000 UTC m=+146.733069438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.865285 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.891608 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.891638 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.900615 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.903817 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.928227 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6rwjr" Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.950359 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:35 crc kubenswrapper[4936]: E0930 13:41:35.952062 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.452046671 +0000 UTC m=+146.836048972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:35 crc kubenswrapper[4936]: I0930 13:41:35.986458 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.053229 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.059199 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.559181917 +0000 UTC m=+146.943184358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.147575 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.148279 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.152685 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.152904 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.154093 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.154380 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.654366412 +0000 UTC m=+147.038368713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.165414 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.256519 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.256932 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.256960 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.257319 4936 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.757303275 +0000 UTC m=+147.141305576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.310214 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.347320 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.358449 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.858401304 +0000 UTC m=+147.242403605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.376690 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.377187 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.377396 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.377428 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.377558 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.378089 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.878073802 +0000 UTC m=+147.262076103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.400166 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.460232 4936 generic.go:334] "Generic (PLEG): container finished" podID="661fcd49-29e9-4299-8fa7-9696bb5d1944" containerID="227d7785a4d3ba90022b984891dc38dd3c518d3b85c575898a13d368fb20d8e1" exitCode=0 Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.460321 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" event={"ID":"661fcd49-29e9-4299-8fa7-9696bb5d1944","Type":"ContainerDied","Data":"227d7785a4d3ba90022b984891dc38dd3c518d3b85c575898a13d368fb20d8e1"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.476129 4936 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T13:41:35.709756366Z","Handler":null,"Name":""} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.478617 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.478906 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.978864272 +0000 UTC m=+147.362866573 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.479054 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: E0930 13:41:36.479360 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 13:41:36.979348026 +0000 UTC m=+147.363350317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4vgz6" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.480048 4936 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.480131 4936 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.480787 4936 generic.go:334] "Generic (PLEG): container finished" podID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerID="fbf822e670a893042e4f185c7fd60e2d1ddff97589ab9dca1e4703408f15af05" exitCode=0 Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.480875 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerDied","Data":"fbf822e670a893042e4f185c7fd60e2d1ddff97589ab9dca1e4703408f15af05"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.480902 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerStarted","Data":"3fe5fc1711f845ec3c73a5381f923c47115864b2a31e5300eee3664d0981eaa7"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.484137 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerStarted","Data":"12ffed31f8f75c7e5ca570b9ad207a1c829df923249376c1b6fe093cdd0904a3"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.489655 4936 generic.go:334] "Generic (PLEG): container finished" podID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerID="5c12cc6fc175c198a81f253490668ad91aa143d4151b9e0d1765b7bd0a75d39b" exitCode=0 Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.489733 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerDied","Data":"5c12cc6fc175c198a81f253490668ad91aa143d4151b9e0d1765b7bd0a75d39b"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.489753 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerStarted","Data":"98d707a1113d25a80350c2949cbc539c5069062301b2651e31f31883f2c7c2a1"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.493326 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba953910-a090-4e61-88f2-f1718da34ce3" containerID="b98da817a156fed52b39e0a27d247a657e21b1ad0323b749198918d6e84fde95" exitCode=0 Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.493434 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjhk" event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerDied","Data":"b98da817a156fed52b39e0a27d247a657e21b1ad0323b749198918d6e84fde95"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.493560 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjhk" 
event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerStarted","Data":"20b7b0a3320999900d861cb1fee1938352eeb80ca7790b8ff512dae76ea2e3d7"} Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.494980 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.506544 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-46l8f" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.579853 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.611265 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.660658 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:36 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:36 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:36 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.660711 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.684999 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.688772 4936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.688807 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:36 crc kubenswrapper[4936]: I0930 13:41:36.754708 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4vgz6\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.025285 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.045619 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.451013 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"] Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.563593 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2eec3e7e-81d2-4810-9fa1-5675cdfad48c","Type":"ContainerStarted","Data":"9648b16c4b58f1ead02948b5193d19975aef5bf5c9ea22a85e61dcfdb285b060"} Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.570632 4936 generic.go:334] "Generic (PLEG): container finished" podID="4c24915c-38b8-4237-bf27-0202571c82ff" containerID="dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc" exitCode=0 Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.570693 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerDied","Data":"dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc"} Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.575034 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" event={"ID":"8bb902ae-877a-46b0-8972-2ea22f50782c","Type":"ContainerStarted","Data":"123dd1f2c206580979bdc33ca51d0e4ff5491a8822662efdfcc6e1ed2ef1004c"} Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.641366 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:37 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:37 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:37 crc kubenswrapper[4936]: 
healthz check failed Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.641425 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:37 crc kubenswrapper[4936]: I0930 13:41:37.863899 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.003432 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume\") pod \"661fcd49-29e9-4299-8fa7-9696bb5d1944\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.003508 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume\") pod \"661fcd49-29e9-4299-8fa7-9696bb5d1944\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.003553 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp5vl\" (UniqueName: \"kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl\") pod \"661fcd49-29e9-4299-8fa7-9696bb5d1944\" (UID: \"661fcd49-29e9-4299-8fa7-9696bb5d1944\") " Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.004287 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume" (OuterVolumeSpecName: "config-volume") pod "661fcd49-29e9-4299-8fa7-9696bb5d1944" (UID: "661fcd49-29e9-4299-8fa7-9696bb5d1944"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.013575 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "661fcd49-29e9-4299-8fa7-9696bb5d1944" (UID: "661fcd49-29e9-4299-8fa7-9696bb5d1944"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.017362 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl" (OuterVolumeSpecName: "kube-api-access-dp5vl") pod "661fcd49-29e9-4299-8fa7-9696bb5d1944" (UID: "661fcd49-29e9-4299-8fa7-9696bb5d1944"). InnerVolumeSpecName "kube-api-access-dp5vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.105144 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661fcd49-29e9-4299-8fa7-9696bb5d1944-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.105175 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp5vl\" (UniqueName: \"kubernetes.io/projected/661fcd49-29e9-4299-8fa7-9696bb5d1944-kube-api-access-dp5vl\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.105186 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/661fcd49-29e9-4299-8fa7-9696bb5d1944-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.337446 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.580233 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 13:41:38 crc kubenswrapper[4936]: E0930 13:41:38.580543 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661fcd49-29e9-4299-8fa7-9696bb5d1944" containerName="collect-profiles" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.580556 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="661fcd49-29e9-4299-8fa7-9696bb5d1944" containerName="collect-profiles" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.580652 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="661fcd49-29e9-4299-8fa7-9696bb5d1944" containerName="collect-profiles" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.581112 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.582434 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.582793 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.586420 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.633961 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" event={"ID":"8bb902ae-877a-46b0-8972-2ea22f50782c","Type":"ContainerStarted","Data":"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e"} Sep 30 13:41:38 crc kubenswrapper[4936]: 
I0930 13:41:38.634049 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.637836 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.637915 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz" event={"ID":"661fcd49-29e9-4299-8fa7-9696bb5d1944","Type":"ContainerDied","Data":"895a3deb2b4f844a017c4721794d0283798f2f3c0f595ed6afba58ee9201bf9b"} Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.637951 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="895a3deb2b4f844a017c4721794d0283798f2f3c0f595ed6afba58ee9201bf9b" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.641047 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:38 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:38 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:38 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.641095 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.651465 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"2eec3e7e-81d2-4810-9fa1-5675cdfad48c","Type":"ContainerStarted","Data":"b87c0900f4573817fb937841d668a7ea08f4f08019798d5dd3c06f0a832fe875"} Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.668367 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" podStartSLOduration=128.668244076 podStartE2EDuration="2m8.668244076s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:38.659633923 +0000 UTC m=+149.043636224" watchObservedRunningTime="2025-09-30 13:41:38.668244076 +0000 UTC m=+149.052246377" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.675145 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.675116758 podStartE2EDuration="2.675116758s" podCreationTimestamp="2025-09-30 13:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:38.669509313 +0000 UTC m=+149.053511614" watchObservedRunningTime="2025-09-30 13:41:38.675116758 +0000 UTC m=+149.059119079" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.717779 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.717844 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.820290 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.820654 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.821899 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.837927 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:38 crc kubenswrapper[4936]: I0930 13:41:38.930956 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.025010 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.025068 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.025097 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.025118 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.028726 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.028846 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.029395 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.041160 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.060798 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.078183 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.351666 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.476728 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 13:41:39 crc kubenswrapper[4936]: W0930 13:41:39.489149 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-270e68851c82619a3b8b757389a006be65968926cc0c09844151a1b17c584d6e WatchSource:0}: Error finding container 270e68851c82619a3b8b757389a006be65968926cc0c09844151a1b17c584d6e: Status 404 returned error can't find the container with id 270e68851c82619a3b8b757389a006be65968926cc0c09844151a1b17c584d6e Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.641696 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:39 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:39 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:39 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:39 crc 
kubenswrapper[4936]: I0930 13:41:39.641993 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.673600 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"99a1ba1ded3fb285cb29cbb0d1ef73a670907fa4c594f6b7a9c2b953ad1d82db"} Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.680767 4936 generic.go:334] "Generic (PLEG): container finished" podID="2eec3e7e-81d2-4810-9fa1-5675cdfad48c" containerID="b87c0900f4573817fb937841d668a7ea08f4f08019798d5dd3c06f0a832fe875" exitCode=0 Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.681106 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2eec3e7e-81d2-4810-9fa1-5675cdfad48c","Type":"ContainerDied","Data":"b87c0900f4573817fb937841d668a7ea08f4f08019798d5dd3c06f0a832fe875"} Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.699712 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1f30cc6-6872-4e2c-a751-dfd2a96202c4","Type":"ContainerStarted","Data":"1c1e7a9a8f63f354c6fd057692cfe0d74136aaba150c8177eb5892ba1d6be59c"} Sep 30 13:41:39 crc kubenswrapper[4936]: I0930 13:41:39.717260 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"270e68851c82619a3b8b757389a006be65968926cc0c09844151a1b17c584d6e"} Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.623975 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.630374 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hj57l" Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.648281 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:40 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:40 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:40 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.648322 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.791079 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4fc068b709c0d436f5c82b1db81bcf7e77e8ae2c7305e453bdf7ca07c223f64b"} Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.792170 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.866650 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1f30cc6-6872-4e2c-a751-dfd2a96202c4","Type":"ContainerStarted","Data":"27fd54927569c93c57892371283b033233bd704e78180a84445fc44864165d8f"} Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 
13:41:40.875539 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"725451aa8cd711f0ff709f30012533b3ac319ba6640beda2428552b0eaf13d53"} Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.875581 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"907b7cc5e982ba49be59ec052de422aecc0226cb099c8a5bb72ba2a8d33a1970"} Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.914773 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.91475594 podStartE2EDuration="2.91475594s" podCreationTimestamp="2025-09-30 13:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:41:40.888236491 +0000 UTC m=+151.272238792" watchObservedRunningTime="2025-09-30 13:41:40.91475594 +0000 UTC m=+151.298758241" Sep 30 13:41:40 crc kubenswrapper[4936]: I0930 13:41:40.973369 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"59dff9b42a25bd90b13f9ccc9252974dbed0d95ff3ba4b7a8e51b6d8bcbcf2b2"} Sep 30 13:41:41 crc kubenswrapper[4936]: E0930 13:41:41.368391 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podc1f30cc6_6872_4e2c_a751_dfd2a96202c4.slice/crio-conmon-27fd54927569c93c57892371283b033233bd704e78180a84445fc44864165d8f.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:41:41 crc kubenswrapper[4936]: 
I0930 13:41:41.423972 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.424375 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bjbnj" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.593990 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access\") pod \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.594110 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir\") pod \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\" (UID: \"2eec3e7e-81d2-4810-9fa1-5675cdfad48c\") " Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.595670 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2eec3e7e-81d2-4810-9fa1-5675cdfad48c" (UID: "2eec3e7e-81d2-4810-9fa1-5675cdfad48c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.601895 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2eec3e7e-81d2-4810-9fa1-5675cdfad48c" (UID: "2eec3e7e-81d2-4810-9fa1-5675cdfad48c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.653666 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:41 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:41 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:41 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.653736 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.695285 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:41 crc kubenswrapper[4936]: I0930 13:41:41.695319 4936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2eec3e7e-81d2-4810-9fa1-5675cdfad48c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.002839 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.002853 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2eec3e7e-81d2-4810-9fa1-5675cdfad48c","Type":"ContainerDied","Data":"9648b16c4b58f1ead02948b5193d19975aef5bf5c9ea22a85e61dcfdb285b060"} Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.003197 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9648b16c4b58f1ead02948b5193d19975aef5bf5c9ea22a85e61dcfdb285b060" Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.010028 4936 generic.go:334] "Generic (PLEG): container finished" podID="c1f30cc6-6872-4e2c-a751-dfd2a96202c4" containerID="27fd54927569c93c57892371283b033233bd704e78180a84445fc44864165d8f" exitCode=0 Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.010832 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1f30cc6-6872-4e2c-a751-dfd2a96202c4","Type":"ContainerDied","Data":"27fd54927569c93c57892371283b033233bd704e78180a84445fc44864165d8f"} Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.642803 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:42 crc kubenswrapper[4936]: [-]has-synced failed: reason withheld Sep 30 13:41:42 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:42 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:42 crc kubenswrapper[4936]: I0930 13:41:42.642853 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.333612 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.434447 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir\") pod \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.434552 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access\") pod \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\" (UID: \"c1f30cc6-6872-4e2c-a751-dfd2a96202c4\") " Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.434668 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c1f30cc6-6872-4e2c-a751-dfd2a96202c4" (UID: "c1f30cc6-6872-4e2c-a751-dfd2a96202c4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.436003 4936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.444539 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c1f30cc6-6872-4e2c-a751-dfd2a96202c4" (UID: "c1f30cc6-6872-4e2c-a751-dfd2a96202c4"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.537553 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1f30cc6-6872-4e2c-a751-dfd2a96202c4-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.645290 4936 patch_prober.go:28] interesting pod/router-default-5444994796-sht9l container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 13:41:43 crc kubenswrapper[4936]: [+]has-synced ok Sep 30 13:41:43 crc kubenswrapper[4936]: [+]process-running ok Sep 30 13:41:43 crc kubenswrapper[4936]: healthz check failed Sep 30 13:41:43 crc kubenswrapper[4936]: I0930 13:41:43.645363 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sht9l" podUID="2ea3912a-d1e4-4c08-81d1-3788ce6096e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:41:44 crc kubenswrapper[4936]: I0930 13:41:44.051468 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c1f30cc6-6872-4e2c-a751-dfd2a96202c4","Type":"ContainerDied","Data":"1c1e7a9a8f63f354c6fd057692cfe0d74136aaba150c8177eb5892ba1d6be59c"} Sep 30 13:41:44 crc kubenswrapper[4936]: I0930 13:41:44.051512 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1e7a9a8f63f354c6fd057692cfe0d74136aaba150c8177eb5892ba1d6be59c" Sep 30 13:41:44 crc kubenswrapper[4936]: I0930 13:41:44.051600 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 13:41:44 crc kubenswrapper[4936]: I0930 13:41:44.643237 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:44 crc kubenswrapper[4936]: I0930 13:41:44.647711 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-sht9l" Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.085494 4936 patch_prober.go:28] interesting pod/console-f9d7485db-jl85m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.085543 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jl85m" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.232094 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.232155 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nfcbn" podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.232107 4936 patch_prober.go:28] interesting pod/downloads-7954f5f757-nfcbn 
container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Sep 30 13:41:45 crc kubenswrapper[4936]: I0930 13:41:45.232254 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nfcbn" podUID="d50867fd-81e4-416d-a112-a84b175be026" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Sep 30 13:41:48 crc kubenswrapper[4936]: I0930 13:41:48.250781 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:41:48 crc kubenswrapper[4936]: I0930 13:41:48.251201 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:41:52 crc kubenswrapper[4936]: I0930 13:41:52.137990 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: \"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:52 crc kubenswrapper[4936]: I0930 13:41:52.145570 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bd8048-3efa-41ed-a7ff-8d477db72be7-metrics-certs\") pod \"network-metrics-daemon-2v46m\" (UID: 
\"e3bd8048-3efa-41ed-a7ff-8d477db72be7\") " pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:52 crc kubenswrapper[4936]: I0930 13:41:52.329479 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v46m" Sep 30 13:41:55 crc kubenswrapper[4936]: I0930 13:41:55.029765 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v46m"] Sep 30 13:41:55 crc kubenswrapper[4936]: I0930 13:41:55.089197 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:55 crc kubenswrapper[4936]: I0930 13:41:55.093127 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:41:55 crc kubenswrapper[4936]: I0930 13:41:55.173523 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v46m" event={"ID":"e3bd8048-3efa-41ed-a7ff-8d477db72be7","Type":"ContainerStarted","Data":"9e376975799d708cc7aad468908f20bd654cffd5b12f0967cdb21071c235826f"} Sep 30 13:41:55 crc kubenswrapper[4936]: I0930 13:41:55.236841 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nfcbn" Sep 30 13:41:57 crc kubenswrapper[4936]: I0930 13:41:57.054388 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:41:57 crc kubenswrapper[4936]: I0930 13:41:57.187061 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v46m" event={"ID":"e3bd8048-3efa-41ed-a7ff-8d477db72be7","Type":"ContainerStarted","Data":"ff4cb48baa64cf7e0d183ea61ddb66b2bac261de40839cfe2dd8e62fc229e79b"} Sep 30 13:42:05 crc kubenswrapper[4936]: E0930 13:42:05.599657 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 13:42:05 crc kubenswrapper[4936]: E0930 13:42:05.600364 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9tmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hsdwj_openshift-marketplace(eea297dc-9577-48fb-b7b4-e1731a2d44d7): ErrImagePull: rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:42:05 crc kubenswrapper[4936]: E0930 13:42:05.601536 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hsdwj" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" Sep 30 13:42:06 crc kubenswrapper[4936]: I0930 13:42:06.355716 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v22qc" Sep 30 13:42:07 crc kubenswrapper[4936]: E0930 13:42:07.895704 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hsdwj" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" Sep 30 13:42:09 crc kubenswrapper[4936]: I0930 13:42:09.062358 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:42:15 crc kubenswrapper[4936]: E0930 13:42:15.922566 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 13:42:15 crc kubenswrapper[4936]: E0930 13:42:15.923307 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdfll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l7pm4_openshift-marketplace(4ebbf1be-a135-4aab-8f09-5d77fc1b1a60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:42:15 crc kubenswrapper[4936]: E0930 13:42:15.924469 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l7pm4" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" Sep 30 13:42:16 crc 
kubenswrapper[4936]: E0930 13:42:16.766272 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l7pm4" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" Sep 30 13:42:17 crc kubenswrapper[4936]: E0930 13:42:17.363442 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 13:42:17 crc kubenswrapper[4936]: E0930 13:42:17.363850 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8b7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tnh5g_openshift-marketplace(4c24915c-38b8-4237-bf27-0202571c82ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:42:17 crc kubenswrapper[4936]: E0930 13:42:17.364957 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tnh5g" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" Sep 30 13:42:17 crc 
kubenswrapper[4936]: E0930 13:42:17.790635 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 13:42:17 crc kubenswrapper[4936]: E0930 13:42:17.791164 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxn4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-sbjhk_openshift-marketplace(ba953910-a090-4e61-88f2-f1718da34ce3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:42:17 crc kubenswrapper[4936]: E0930 13:42:17.795296 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sbjhk" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.250256 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.250321 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.316633 4936 generic.go:334] "Generic (PLEG): container finished" podID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerID="daef17f407b2e380c8cec92dbe60cb72e2c1be5dc263f37d17a6b5678d0261fd" exitCode=0 Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.321219 4936 generic.go:334] "Generic (PLEG): container finished" podID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerID="f18d4c2728942a7e65a12b7f96d23ab8dbc5312a4ed604b5c1cb15f8e9751758" exitCode=0 Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.322073 4936 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-hl59l" event={"ID":"f1141335-cd5a-4799-b931-d1e24f3d0044","Type":"ContainerDied","Data":"daef17f407b2e380c8cec92dbe60cb72e2c1be5dc263f37d17a6b5678d0261fd"} Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.322108 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerDied","Data":"f18d4c2728942a7e65a12b7f96d23ab8dbc5312a4ed604b5c1cb15f8e9751758"} Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.323037 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerStarted","Data":"64ab4bb6485a3f2755e7b634cbb203538c0e2e8e27172e1845ff711663b2cdca"} Sep 30 13:42:18 crc kubenswrapper[4936]: I0930 13:42:18.324882 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v46m" event={"ID":"e3bd8048-3efa-41ed-a7ff-8d477db72be7","Type":"ContainerStarted","Data":"5cd11982e6e41f47fc27964f542514e0c4b56c79fac52c2825766847b7da6bea"} Sep 30 13:42:18 crc kubenswrapper[4936]: E0930 13:42:18.326113 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sbjhk" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" Sep 30 13:42:18 crc kubenswrapper[4936]: E0930 13:42:18.326420 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tnh5g" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" Sep 30 13:42:18 crc 
kubenswrapper[4936]: I0930 13:42:18.385416 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2v46m" podStartSLOduration=168.385399565 podStartE2EDuration="2m48.385399565s" podCreationTimestamp="2025-09-30 13:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:42:18.379635236 +0000 UTC m=+188.763637537" watchObservedRunningTime="2025-09-30 13:42:18.385399565 +0000 UTC m=+188.769401866" Sep 30 13:42:19 crc kubenswrapper[4936]: I0930 13:42:19.331387 4936 generic.go:334] "Generic (PLEG): container finished" podID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerID="64ab4bb6485a3f2755e7b634cbb203538c0e2e8e27172e1845ff711663b2cdca" exitCode=0 Sep 30 13:42:19 crc kubenswrapper[4936]: I0930 13:42:19.331436 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerDied","Data":"64ab4bb6485a3f2755e7b634cbb203538c0e2e8e27172e1845ff711663b2cdca"} Sep 30 13:42:19 crc kubenswrapper[4936]: E0930 13:42:19.772626 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 13:42:19 crc kubenswrapper[4936]: E0930 13:42:19.772841 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xc486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m6dxk_openshift-marketplace(25d411f6-43fa-4ee8-a820-b5f77a3d94ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:42:19 crc kubenswrapper[4936]: E0930 13:42:19.774555 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m6dxk" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" Sep 30 13:42:20 crc 
kubenswrapper[4936]: E0930 13:42:20.345997 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m6dxk" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" Sep 30 13:42:25 crc kubenswrapper[4936]: I0930 13:42:25.365248 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerStarted","Data":"abec549b847faed74a62d7a6c39c4f0ee1e4af0c2867c86875de0d230adcfc71"} Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.372063 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hl59l" event={"ID":"f1141335-cd5a-4799-b931-d1e24f3d0044","Type":"ContainerStarted","Data":"b250838fcb241c4454be4b60285c6a7e7ff6a9e3d7aed71bfb1ef0696e1be744"} Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.374646 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerStarted","Data":"7c610cdc270397500eef1f0de0154fdd812dbf3da2bfde3e264456ff3106e81b"} Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.376930 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerStarted","Data":"bd8f1fcc3d6501322d0c4b13ff63fa918ac442d6ef2b956dc123069785fcfb67"} Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.398480 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hl59l" podStartSLOduration=3.813317847 podStartE2EDuration="54.398460741s" podCreationTimestamp="2025-09-30 13:41:32 +0000 UTC" 
firstStartedPulling="2025-09-30 13:41:35.409726845 +0000 UTC m=+145.793729146" lastFinishedPulling="2025-09-30 13:42:25.994869739 +0000 UTC m=+196.378872040" observedRunningTime="2025-09-30 13:42:26.393945758 +0000 UTC m=+196.777948069" watchObservedRunningTime="2025-09-30 13:42:26.398460741 +0000 UTC m=+196.782463052" Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.434589 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rh7ln" podStartSLOduration=3.809709181 podStartE2EDuration="54.434567411s" podCreationTimestamp="2025-09-30 13:41:32 +0000 UTC" firstStartedPulling="2025-09-30 13:41:35.421886502 +0000 UTC m=+145.805898533" lastFinishedPulling="2025-09-30 13:42:26.046754462 +0000 UTC m=+196.430756763" observedRunningTime="2025-09-30 13:42:26.430198153 +0000 UTC m=+196.814200464" watchObservedRunningTime="2025-09-30 13:42:26.434567411 +0000 UTC m=+196.818569722" Sep 30 13:42:26 crc kubenswrapper[4936]: I0930 13:42:26.458767 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-thjx8" podStartSLOduration=4.472419385 podStartE2EDuration="52.458747141s" podCreationTimestamp="2025-09-30 13:41:34 +0000 UTC" firstStartedPulling="2025-09-30 13:41:36.492516043 +0000 UTC m=+146.876518334" lastFinishedPulling="2025-09-30 13:42:24.478843779 +0000 UTC m=+194.862846090" observedRunningTime="2025-09-30 13:42:26.457752032 +0000 UTC m=+196.841754353" watchObservedRunningTime="2025-09-30 13:42:26.458747141 +0000 UTC m=+196.842749442" Sep 30 13:42:27 crc kubenswrapper[4936]: I0930 13:42:27.383532 4936 generic.go:334] "Generic (PLEG): container finished" podID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerID="bd8f1fcc3d6501322d0c4b13ff63fa918ac442d6ef2b956dc123069785fcfb67" exitCode=0 Sep 30 13:42:27 crc kubenswrapper[4936]: I0930 13:42:27.385223 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerDied","Data":"bd8f1fcc3d6501322d0c4b13ff63fa918ac442d6ef2b956dc123069785fcfb67"} Sep 30 13:42:28 crc kubenswrapper[4936]: I0930 13:42:28.393611 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerStarted","Data":"fd114c3e587a0fd263ff933164e6e700bc0ca89b5bf33d8428a2850f57f78d83"} Sep 30 13:42:28 crc kubenswrapper[4936]: I0930 13:42:28.412398 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hsdwj" podStartSLOduration=3.019654183 podStartE2EDuration="56.41238071s" podCreationTimestamp="2025-09-30 13:41:32 +0000 UTC" firstStartedPulling="2025-09-30 13:41:34.379404558 +0000 UTC m=+144.763406859" lastFinishedPulling="2025-09-30 13:42:27.772131095 +0000 UTC m=+198.156133386" observedRunningTime="2025-09-30 13:42:28.408485018 +0000 UTC m=+198.792487319" watchObservedRunningTime="2025-09-30 13:42:28.41238071 +0000 UTC m=+198.796383001" Sep 30 13:42:30 crc kubenswrapper[4936]: I0930 13:42:30.403909 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerStarted","Data":"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b"} Sep 30 13:42:31 crc kubenswrapper[4936]: I0930 13:42:31.418864 4936 generic.go:334] "Generic (PLEG): container finished" podID="4c24915c-38b8-4237-bf27-0202571c82ff" containerID="89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b" exitCode=0 Sep 30 13:42:31 crc kubenswrapper[4936]: I0930 13:42:31.418884 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" 
event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerDied","Data":"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b"} Sep 30 13:42:31 crc kubenswrapper[4936]: E0930 13:42:31.951606 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ebbf1be_a135_4aab_8f09_5d77fc1b1a60.slice/crio-051c91ce806f209484255146a19b01c7d80aaa7be707eec61387568ea4a94657.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:42:32 crc kubenswrapper[4936]: I0930 13:42:32.425567 4936 generic.go:334] "Generic (PLEG): container finished" podID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerID="051c91ce806f209484255146a19b01c7d80aaa7be707eec61387568ea4a94657" exitCode=0 Sep 30 13:42:32 crc kubenswrapper[4936]: I0930 13:42:32.425663 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerDied","Data":"051c91ce806f209484255146a19b01c7d80aaa7be707eec61387568ea4a94657"} Sep 30 13:42:32 crc kubenswrapper[4936]: I0930 13:42:32.503153 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:42:32 crc kubenswrapper[4936]: I0930 13:42:32.503190 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.202388 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.202830 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.510775 4936 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.510831 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.654997 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.656096 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.656221 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.706703 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:42:33 crc kubenswrapper[4936]: I0930 13:42:33.727578 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:34 crc kubenswrapper[4936]: I0930 13:42:34.474121 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.247026 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.247372 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.285584 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 
13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.474634 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.608155 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.811469 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:42:35 crc kubenswrapper[4936]: I0930 13:42:35.811664 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rh7ln" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="registry-server" containerID="cri-o://7c610cdc270397500eef1f0de0154fdd812dbf3da2bfde3e264456ff3106e81b" gracePeriod=2 Sep 30 13:42:36 crc kubenswrapper[4936]: I0930 13:42:36.443183 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hl59l" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="registry-server" containerID="cri-o://b250838fcb241c4454be4b60285c6a7e7ff6a9e3d7aed71bfb1ef0696e1be744" gracePeriod=2 Sep 30 13:42:37 crc kubenswrapper[4936]: I0930 13:42:37.449804 4936 generic.go:334] "Generic (PLEG): container finished" podID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerID="7c610cdc270397500eef1f0de0154fdd812dbf3da2bfde3e264456ff3106e81b" exitCode=0 Sep 30 13:42:37 crc kubenswrapper[4936]: I0930 13:42:37.449889 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerDied","Data":"7c610cdc270397500eef1f0de0154fdd812dbf3da2bfde3e264456ff3106e81b"} Sep 30 13:42:38 crc kubenswrapper[4936]: I0930 13:42:38.210652 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:42:38 crc kubenswrapper[4936]: I0930 13:42:38.210937 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-thjx8" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="registry-server" containerID="cri-o://abec549b847faed74a62d7a6c39c4f0ee1e4af0c2867c86875de0d230adcfc71" gracePeriod=2 Sep 30 13:42:38 crc kubenswrapper[4936]: I0930 13:42:38.460209 4936 generic.go:334] "Generic (PLEG): container finished" podID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerID="b250838fcb241c4454be4b60285c6a7e7ff6a9e3d7aed71bfb1ef0696e1be744" exitCode=0 Sep 30 13:42:38 crc kubenswrapper[4936]: I0930 13:42:38.460250 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hl59l" event={"ID":"f1141335-cd5a-4799-b931-d1e24f3d0044","Type":"ContainerDied","Data":"b250838fcb241c4454be4b60285c6a7e7ff6a9e3d7aed71bfb1ef0696e1be744"} Sep 30 13:42:38 crc kubenswrapper[4936]: I0930 13:42:38.913496 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.086372 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities\") pod \"882bdf58-1728-423e-94c1-79c8dad7ab55\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.086453 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content\") pod \"882bdf58-1728-423e-94c1-79c8dad7ab55\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.086503 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tj4v\" (UniqueName: \"kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v\") pod \"882bdf58-1728-423e-94c1-79c8dad7ab55\" (UID: \"882bdf58-1728-423e-94c1-79c8dad7ab55\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.093393 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v" (OuterVolumeSpecName: "kube-api-access-4tj4v") pod "882bdf58-1728-423e-94c1-79c8dad7ab55" (UID: "882bdf58-1728-423e-94c1-79c8dad7ab55"). InnerVolumeSpecName "kube-api-access-4tj4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.128405 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities" (OuterVolumeSpecName: "utilities") pod "882bdf58-1728-423e-94c1-79c8dad7ab55" (UID: "882bdf58-1728-423e-94c1-79c8dad7ab55"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.187868 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tj4v\" (UniqueName: \"kubernetes.io/projected/882bdf58-1728-423e-94c1-79c8dad7ab55-kube-api-access-4tj4v\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.187925 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.327907 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "882bdf58-1728-423e-94c1-79c8dad7ab55" (UID: "882bdf58-1728-423e-94c1-79c8dad7ab55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.390129 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/882bdf58-1728-423e-94c1-79c8dad7ab55-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.476375 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh7ln" event={"ID":"882bdf58-1728-423e-94c1-79c8dad7ab55","Type":"ContainerDied","Data":"ac1354579bc6dbda24a29b80345e8d4ecbba636ff1d7836a7f2613836c20a8ab"} Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.476438 4936 scope.go:117] "RemoveContainer" containerID="7c610cdc270397500eef1f0de0154fdd812dbf3da2bfde3e264456ff3106e81b" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.476566 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh7ln" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.482768 4936 generic.go:334] "Generic (PLEG): container finished" podID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerID="abec549b847faed74a62d7a6c39c4f0ee1e4af0c2867c86875de0d230adcfc71" exitCode=0 Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.482815 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerDied","Data":"abec549b847faed74a62d7a6c39c4f0ee1e4af0c2867c86875de0d230adcfc71"} Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.517497 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.521266 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rh7ln"] Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.524774 4936 scope.go:117] "RemoveContainer" containerID="64ab4bb6485a3f2755e7b634cbb203538c0e2e8e27172e1845ff711663b2cdca" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.547613 4936 scope.go:117] "RemoveContainer" containerID="1f8f4bdf6e680ca5bd5398bea7897ecd021d429235c3c446efb5aa54e0bd89f9" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.605192 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.692942 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddf4j\" (UniqueName: \"kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j\") pod \"f1141335-cd5a-4799-b931-d1e24f3d0044\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.693039 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities\") pod \"f1141335-cd5a-4799-b931-d1e24f3d0044\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.693126 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content\") pod \"f1141335-cd5a-4799-b931-d1e24f3d0044\" (UID: \"f1141335-cd5a-4799-b931-d1e24f3d0044\") " Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.693853 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities" (OuterVolumeSpecName: "utilities") pod "f1141335-cd5a-4799-b931-d1e24f3d0044" (UID: "f1141335-cd5a-4799-b931-d1e24f3d0044"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.695906 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j" (OuterVolumeSpecName: "kube-api-access-ddf4j") pod "f1141335-cd5a-4799-b931-d1e24f3d0044" (UID: "f1141335-cd5a-4799-b931-d1e24f3d0044"). InnerVolumeSpecName "kube-api-access-ddf4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.735523 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1141335-cd5a-4799-b931-d1e24f3d0044" (UID: "f1141335-cd5a-4799-b931-d1e24f3d0044"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.794410 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.794454 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1141335-cd5a-4799-b931-d1e24f3d0044-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:39 crc kubenswrapper[4936]: I0930 13:42:39.794464 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddf4j\" (UniqueName: \"kubernetes.io/projected/f1141335-cd5a-4799-b931-d1e24f3d0044-kube-api-access-ddf4j\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:40 crc kubenswrapper[4936]: I0930 13:42:40.491445 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hl59l" Sep 30 13:42:40 crc kubenswrapper[4936]: I0930 13:42:40.917853 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" path="/var/lib/kubelet/pods/882bdf58-1728-423e-94c1-79c8dad7ab55/volumes" Sep 30 13:42:40 crc kubenswrapper[4936]: I0930 13:42:40.918827 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:42:40 crc kubenswrapper[4936]: I0930 13:42:40.918854 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hl59l"] Sep 30 13:42:40 crc kubenswrapper[4936]: I0930 13:42:40.918896 4936 scope.go:117] "RemoveContainer" containerID="b250838fcb241c4454be4b60285c6a7e7ff6a9e3d7aed71bfb1ef0696e1be744" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.323629 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" path="/var/lib/kubelet/pods/f1141335-cd5a-4799-b931-d1e24f3d0044/volumes" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.391967 4936 scope.go:117] "RemoveContainer" containerID="daef17f407b2e380c8cec92dbe60cb72e2c1be5dc263f37d17a6b5678d0261fd" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.499783 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerStarted","Data":"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7"} Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.501598 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba953910-a090-4e61-88f2-f1718da34ce3" containerID="ba52a81f758ba13d5f8d90355419733133eed7f688ea5a462b837402ec430c0b" exitCode=0 Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.501647 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sbjhk" event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerDied","Data":"ba52a81f758ba13d5f8d90355419733133eed7f688ea5a462b837402ec430c0b"} Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.634096 4936 scope.go:117] "RemoveContainer" containerID="e6f2b8a38e21c53aad1076d43c0af24e8b46f526a5a339bd88e128784bc0e96e" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.702781 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.729826 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnh5g" podStartSLOduration=5.781001669 podStartE2EDuration="1m7.72981042s" podCreationTimestamp="2025-09-30 13:41:35 +0000 UTC" firstStartedPulling="2025-09-30 13:41:37.576054803 +0000 UTC m=+147.960057104" lastFinishedPulling="2025-09-30 13:42:39.524863554 +0000 UTC m=+209.908865855" observedRunningTime="2025-09-30 13:42:42.519163303 +0000 UTC m=+212.903165594" watchObservedRunningTime="2025-09-30 13:42:42.72981042 +0000 UTC m=+213.113812721" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.840903 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities\") pod \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.840948 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content\") pod \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.840969 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gblq6\" (UniqueName: \"kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6\") pod \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\" (UID: \"04ed78b8-1a01-47f3-9d6e-9af3cbf62221\") " Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.842384 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities" (OuterVolumeSpecName: "utilities") pod "04ed78b8-1a01-47f3-9d6e-9af3cbf62221" (UID: "04ed78b8-1a01-47f3-9d6e-9af3cbf62221"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.846757 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6" (OuterVolumeSpecName: "kube-api-access-gblq6") pod "04ed78b8-1a01-47f3-9d6e-9af3cbf62221" (UID: "04ed78b8-1a01-47f3-9d6e-9af3cbf62221"). InnerVolumeSpecName "kube-api-access-gblq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.848149 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.848167 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gblq6\" (UniqueName: \"kubernetes.io/projected/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-kube-api-access-gblq6\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.857002 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04ed78b8-1a01-47f3-9d6e-9af3cbf62221" (UID: "04ed78b8-1a01-47f3-9d6e-9af3cbf62221"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:42 crc kubenswrapper[4936]: I0930 13:42:42.949373 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed78b8-1a01-47f3-9d6e-9af3cbf62221-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.509596 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerStarted","Data":"b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1"} Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.513317 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thjx8" event={"ID":"04ed78b8-1a01-47f3-9d6e-9af3cbf62221","Type":"ContainerDied","Data":"98d707a1113d25a80350c2949cbc539c5069062301b2651e31f31883f2c7c2a1"} Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 
13:42:43.513361 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thjx8" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.513397 4936 scope.go:117] "RemoveContainer" containerID="abec549b847faed74a62d7a6c39c4f0ee1e4af0c2867c86875de0d230adcfc71" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.515454 4936 generic.go:334] "Generic (PLEG): container finished" podID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerID="0a3c2f0305bb69c3d870c0ca91b976ab06c3e655653c8c978d2a298a68abbac3" exitCode=0 Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.515480 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerDied","Data":"0a3c2f0305bb69c3d870c0ca91b976ab06c3e655653c8c978d2a298a68abbac3"} Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.535802 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7pm4" podStartSLOduration=2.318955605 podStartE2EDuration="1m8.535782274s" podCreationTimestamp="2025-09-30 13:41:35 +0000 UTC" firstStartedPulling="2025-09-30 13:41:36.492550554 +0000 UTC m=+146.876552855" lastFinishedPulling="2025-09-30 13:42:42.709377233 +0000 UTC m=+213.093379524" observedRunningTime="2025-09-30 13:42:43.53275183 +0000 UTC m=+213.916754141" watchObservedRunningTime="2025-09-30 13:42:43.535782274 +0000 UTC m=+213.919784575" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.545576 4936 scope.go:117] "RemoveContainer" containerID="f18d4c2728942a7e65a12b7f96d23ab8dbc5312a4ed604b5c1cb15f8e9751758" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.560078 4936 scope.go:117] "RemoveContainer" containerID="5c12cc6fc175c198a81f253490668ad91aa143d4151b9e0d1765b7bd0a75d39b" Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.603277 4936 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:42:43 crc kubenswrapper[4936]: I0930 13:42:43.608033 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-thjx8"] Sep 30 13:42:44 crc kubenswrapper[4936]: I0930 13:42:44.321159 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" path="/var/lib/kubelet/pods/04ed78b8-1a01-47f3-9d6e-9af3cbf62221/volumes" Sep 30 13:42:44 crc kubenswrapper[4936]: I0930 13:42:44.522988 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjhk" event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerStarted","Data":"6a29ca35f743dd7c827c3edc866cd74f8c43466954b9ed2c7c264993b8f282d2"} Sep 30 13:42:44 crc kubenswrapper[4936]: I0930 13:42:44.526233 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerStarted","Data":"75ecf99a02c44e96af75f154aa05f953863a2a9ac08ac5f783a2b1f6b904a008"} Sep 30 13:42:44 crc kubenswrapper[4936]: I0930 13:42:44.539189 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbjhk" podStartSLOduration=3.058166614 podStartE2EDuration="1m10.539174638s" podCreationTimestamp="2025-09-30 13:41:34 +0000 UTC" firstStartedPulling="2025-09-30 13:41:36.563119696 +0000 UTC m=+146.947121997" lastFinishedPulling="2025-09-30 13:42:44.04412772 +0000 UTC m=+214.428130021" observedRunningTime="2025-09-30 13:42:44.537512383 +0000 UTC m=+214.921514694" watchObservedRunningTime="2025-09-30 13:42:44.539174638 +0000 UTC m=+214.923176939" Sep 30 13:42:44 crc kubenswrapper[4936]: I0930 13:42:44.874737 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:42:44 crc 
kubenswrapper[4936]: I0930 13:42:44.874789 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.010479 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m6dxk" podStartSLOduration=4.256479926 podStartE2EDuration="1m13.010463989s" podCreationTimestamp="2025-09-30 13:41:32 +0000 UTC" firstStartedPulling="2025-09-30 13:41:35.404403088 +0000 UTC m=+145.788405389" lastFinishedPulling="2025-09-30 13:42:44.158387151 +0000 UTC m=+214.542389452" observedRunningTime="2025-09-30 13:42:44.557358918 +0000 UTC m=+214.941361229" watchObservedRunningTime="2025-09-30 13:42:45.010463989 +0000 UTC m=+215.394466290" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.013238 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"] Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.646562 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.647352 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.865870 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.865916 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:45 crc kubenswrapper[4936]: I0930 13:42:45.910567 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sbjhk" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="registry-server" probeResult="failure" 
output=< Sep 30 13:42:45 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 13:42:45 crc kubenswrapper[4936]: > Sep 30 13:42:46 crc kubenswrapper[4936]: I0930 13:42:46.681230 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l7pm4" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" probeResult="failure" output=< Sep 30 13:42:46 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 13:42:46 crc kubenswrapper[4936]: > Sep 30 13:42:46 crc kubenswrapper[4936]: I0930 13:42:46.906797 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnh5g" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="registry-server" probeResult="failure" output=< Sep 30 13:42:46 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 13:42:46 crc kubenswrapper[4936]: > Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.250599 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.250945 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.250996 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.251593 
4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.251697 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c" gracePeriod=600 Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.549585 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c" exitCode=0 Sep 30 13:42:48 crc kubenswrapper[4936]: I0930 13:42:48.549665 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c"} Sep 30 13:42:49 crc kubenswrapper[4936]: I0930 13:42:49.555768 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40"} Sep 30 13:42:53 crc kubenswrapper[4936]: I0930 13:42:53.017967 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:42:53 crc kubenswrapper[4936]: I0930 13:42:53.018577 4936 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:42:53 crc kubenswrapper[4936]: I0930 13:42:53.088236 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:42:53 crc kubenswrapper[4936]: I0930 13:42:53.608384 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:42:54 crc kubenswrapper[4936]: I0930 13:42:54.913233 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:42:54 crc kubenswrapper[4936]: I0930 13:42:54.949606 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:42:55 crc kubenswrapper[4936]: I0930 13:42:55.684898 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:42:55 crc kubenswrapper[4936]: I0930 13:42:55.745680 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:42:55 crc kubenswrapper[4936]: I0930 13:42:55.900558 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:55 crc kubenswrapper[4936]: I0930 13:42:55.943675 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:57 crc kubenswrapper[4936]: I0930 13:42:57.927070 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:42:57 crc kubenswrapper[4936]: I0930 13:42:57.927570 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnh5g" 
podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="registry-server" containerID="cri-o://be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7" gracePeriod=2 Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.305602 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.376292 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8b7c\" (UniqueName: \"kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c\") pod \"4c24915c-38b8-4237-bf27-0202571c82ff\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.376378 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content\") pod \"4c24915c-38b8-4237-bf27-0202571c82ff\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.376424 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities\") pod \"4c24915c-38b8-4237-bf27-0202571c82ff\" (UID: \"4c24915c-38b8-4237-bf27-0202571c82ff\") " Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.381407 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities" (OuterVolumeSpecName: "utilities") pod "4c24915c-38b8-4237-bf27-0202571c82ff" (UID: "4c24915c-38b8-4237-bf27-0202571c82ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.385275 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c" (OuterVolumeSpecName: "kube-api-access-l8b7c") pod "4c24915c-38b8-4237-bf27-0202571c82ff" (UID: "4c24915c-38b8-4237-bf27-0202571c82ff"). InnerVolumeSpecName "kube-api-access-l8b7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.472367 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c24915c-38b8-4237-bf27-0202571c82ff" (UID: "4c24915c-38b8-4237-bf27-0202571c82ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.477824 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8b7c\" (UniqueName: \"kubernetes.io/projected/4c24915c-38b8-4237-bf27-0202571c82ff-kube-api-access-l8b7c\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.477947 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.478015 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c24915c-38b8-4237-bf27-0202571c82ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.608517 4936 generic.go:334] "Generic (PLEG): container finished" podID="4c24915c-38b8-4237-bf27-0202571c82ff" 
containerID="be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7" exitCode=0 Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.608717 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerDied","Data":"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7"} Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.608850 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnh5g" event={"ID":"4c24915c-38b8-4237-bf27-0202571c82ff","Type":"ContainerDied","Data":"12ffed31f8f75c7e5ca570b9ad207a1c829df923249376c1b6fe093cdd0904a3"} Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.608811 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnh5g" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.608878 4936 scope.go:117] "RemoveContainer" containerID="be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.629642 4936 scope.go:117] "RemoveContainer" containerID="89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.634307 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.639132 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnh5g"] Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.653773 4936 scope.go:117] "RemoveContainer" containerID="dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.679546 4936 scope.go:117] "RemoveContainer" containerID="be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7" Sep 30 13:42:58 crc 
kubenswrapper[4936]: E0930 13:42:58.679989 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7\": container with ID starting with be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7 not found: ID does not exist" containerID="be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.680028 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7"} err="failed to get container status \"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7\": rpc error: code = NotFound desc = could not find container \"be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7\": container with ID starting with be9abbcd941a6aca07db787f2eb094c81491a590b47655f205d52d3ccc1647c7 not found: ID does not exist" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.680055 4936 scope.go:117] "RemoveContainer" containerID="89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b" Sep 30 13:42:58 crc kubenswrapper[4936]: E0930 13:42:58.680499 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b\": container with ID starting with 89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b not found: ID does not exist" containerID="89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.680615 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b"} err="failed to get container status 
\"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b\": rpc error: code = NotFound desc = could not find container \"89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b\": container with ID starting with 89c1a7f4cd4bcd2e05a6babf41c4be73f643f7b846f50809dd731ce9617ede4b not found: ID does not exist" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.680698 4936 scope.go:117] "RemoveContainer" containerID="dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc" Sep 30 13:42:58 crc kubenswrapper[4936]: E0930 13:42:58.681119 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc\": container with ID starting with dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc not found: ID does not exist" containerID="dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc" Sep 30 13:42:58 crc kubenswrapper[4936]: I0930 13:42:58.681155 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc"} err="failed to get container status \"dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc\": rpc error: code = NotFound desc = could not find container \"dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc\": container with ID starting with dda378069d956ccf63c17ed278aeab041178c63d20813a7dfeee9259a49589fc not found: ID does not exist" Sep 30 13:43:00 crc kubenswrapper[4936]: I0930 13:43:00.321752 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" path="/var/lib/kubelet/pods/4c24915c-38b8-4237-bf27-0202571c82ff/volumes" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.036256 4936 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" containerID="cri-o://2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a" gracePeriod=15 Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.426797 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458645 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79b5c48459-mv4jz"] Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458859 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458870 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458881 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458887 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458895 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458901 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458908 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" 
containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458914 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458920 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458925 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458934 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458941 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458947 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458953 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="extract-content" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458961 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458969 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458978 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" 
containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458984 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.458992 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.458998 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.459008 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459013 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.459021 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459027 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="extract-utilities" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.459036 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f30cc6-6872-4e2c-a751-dfd2a96202c4" containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459041 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f30cc6-6872-4e2c-a751-dfd2a96202c4" containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.459051 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eec3e7e-81d2-4810-9fa1-5675cdfad48c" 
containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459056 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eec3e7e-81d2-4810-9fa1-5675cdfad48c" containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.459063 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459070 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459150 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ed78b8-1a01-47f3-9d6e-9af3cbf62221" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459160 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerName="oauth-openshift" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459169 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="882bdf58-1728-423e-94c1-79c8dad7ab55" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459175 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eec3e7e-81d2-4810-9fa1-5675cdfad48c" containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459182 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1141335-cd5a-4799-b931-d1e24f3d0044" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459189 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c24915c-38b8-4237-bf27-0202571c82ff" containerName="registry-server" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459198 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f30cc6-6872-4e2c-a751-dfd2a96202c4" 
containerName="pruner" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.459540 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.482140 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79b5c48459-mv4jz"] Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520622 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520665 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520695 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520731 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520756 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520784 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520814 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520841 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520874 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8f7\" (UniqueName: \"kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520927 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520959 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.520991 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521018 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521048 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig\") pod \"06045f3a-af69-49c7-9759-915cd9fb4c65\" (UID: \"06045f3a-af69-49c7-9759-915cd9fb4c65\") " Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521199 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521228 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-policies\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521294 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-router-certs\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521319 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtrcv\" (UniqueName: \"kubernetes.io/projected/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-kube-api-access-xtrcv\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " 
pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521351 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521398 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521430 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521451 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-dir\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521471 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-error\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521505 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-session\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521528 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521547 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521568 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-login\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521837 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.521890 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.522273 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.526450 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.526875 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.529732 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.530802 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.531194 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.533757 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.534011 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.534008 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.536545 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7" (OuterVolumeSpecName: "kube-api-access-8d8f7") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "kube-api-access-8d8f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.537594 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.537894 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "06045f3a-af69-49c7-9759-915cd9fb4c65" (UID: "06045f3a-af69-49c7-9759-915cd9fb4c65"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622581 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-error\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622660 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-session\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622713 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622736 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-login\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622768 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622792 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622824 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-policies\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622857 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-router-certs\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " 
pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.622880 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtrcv\" (UniqueName: \"kubernetes.io/projected/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-kube-api-access-xtrcv\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623158 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623219 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623278 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623310 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-dir\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623393 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623412 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623423 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623433 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623443 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623453 4936 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623464 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623477 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623492 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623501 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623511 4936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623520 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8f7\" (UniqueName: \"kubernetes.io/projected/06045f3a-af69-49c7-9759-915cd9fb4c65-kube-api-access-8d8f7\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623529 4936 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06045f3a-af69-49c7-9759-915cd9fb4c65-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623542 4936 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06045f3a-af69-49c7-9759-915cd9fb4c65-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.623580 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-dir\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.624836 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.625119 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-audit-policies\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.625827 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-error\") pod 
\"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.625972 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.626269 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.626313 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.628603 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-login\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.628888 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-session\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.629283 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.629590 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.631404 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.633728 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.639802 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtrcv\" (UniqueName: \"kubernetes.io/projected/7f9a3bae-1956-4fc1-a400-1caabe91d7bb-kube-api-access-xtrcv\") pod \"oauth-openshift-79b5c48459-mv4jz\" (UID: \"7f9a3bae-1956-4fc1-a400-1caabe91d7bb\") " pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.670285 4936 generic.go:334] "Generic (PLEG): container finished" podID="06045f3a-af69-49c7-9759-915cd9fb4c65" containerID="2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a" exitCode=0 Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.670327 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" event={"ID":"06045f3a-af69-49c7-9759-915cd9fb4c65","Type":"ContainerDied","Data":"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a"} Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.670376 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.670395 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7ccwq" event={"ID":"06045f3a-af69-49c7-9759-915cd9fb4c65","Type":"ContainerDied","Data":"4a2a1c7048ba45b8b9179f95dee2de30ea3cfb74c638d2bcb88b740b46f15575"} Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.670415 4936 scope.go:117] "RemoveContainer" containerID="2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.694816 4936 scope.go:117] "RemoveContainer" containerID="2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a" Sep 30 13:43:10 crc kubenswrapper[4936]: E0930 13:43:10.696578 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a\": container with ID starting with 2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a not found: ID does not exist" containerID="2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.696627 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a"} err="failed to get container status \"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a\": rpc error: code = NotFound desc = could not find container \"2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a\": container with ID starting with 2288996231d145866b70b29c7c696474e5d84212d42c50fc0c222f7488f8a12a not found: ID does not exist" Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.700295 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"] Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.703880 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7ccwq"] Sep 30 13:43:10 crc kubenswrapper[4936]: I0930 13:43:10.786545 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.159134 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79b5c48459-mv4jz"] Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.678380 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" event={"ID":"7f9a3bae-1956-4fc1-a400-1caabe91d7bb","Type":"ContainerStarted","Data":"f67f4cb3a255d4bfb5c2656270dd7b7b291669c8f255b9658fc5b0ec4d85c1f2"} Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.678715 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" event={"ID":"7f9a3bae-1956-4fc1-a400-1caabe91d7bb","Type":"ContainerStarted","Data":"370d0193f7170c9f6e131ab9e0e6c340c4b70e0b5aa1605e23062795a2fe2ca5"} Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.679161 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.683989 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" Sep 30 13:43:11 crc kubenswrapper[4936]: I0930 13:43:11.701616 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79b5c48459-mv4jz" podStartSLOduration=26.701597285 podStartE2EDuration="26.701597285s" 
podCreationTimestamp="2025-09-30 13:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:43:11.698976143 +0000 UTC m=+242.082978444" watchObservedRunningTime="2025-09-30 13:43:11.701597285 +0000 UTC m=+242.085599586" Sep 30 13:43:12 crc kubenswrapper[4936]: I0930 13:43:12.377626 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06045f3a-af69-49c7-9759-915cd9fb4c65" path="/var/lib/kubelet/pods/06045f3a-af69-49c7-9759-915cd9fb4c65/volumes" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.475436 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.476523 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m6dxk" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="registry-server" containerID="cri-o://75ecf99a02c44e96af75f154aa05f953863a2a9ac08ac5f783a2b1f6b904a008" gracePeriod=30 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.478534 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hsdwj"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.478802 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hsdwj" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="registry-server" containerID="cri-o://fd114c3e587a0fd263ff933164e6e700bc0ca89b5bf33d8428a2850f57f78d83" gracePeriod=30 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.485198 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.485457 4936 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" containerID="cri-o://8daf1693a5932fe145210cd7ec8209aeb27b7b487d521e7fe1388715836e4e26" gracePeriod=30 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.490695 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.490925 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbjhk" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="registry-server" containerID="cri-o://6a29ca35f743dd7c827c3edc866cd74f8c43466954b9ed2c7c264993b8f282d2" gracePeriod=30 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.495370 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.495641 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l7pm4" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" containerID="cri-o://b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" gracePeriod=30 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.508238 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q587x"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.509081 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.530679 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q587x"] Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.636231 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.636294 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2ph\" (UniqueName: \"kubernetes.io/projected/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-kube-api-access-qg2ph\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.636318 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: E0930 13:43:25.649989 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1 is running failed: container process not found" 
containerID="b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:43:25 crc kubenswrapper[4936]: E0930 13:43:25.653750 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1 is running failed: container process not found" containerID="b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:43:25 crc kubenswrapper[4936]: E0930 13:43:25.654417 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1 is running failed: container process not found" containerID="b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:43:25 crc kubenswrapper[4936]: E0930 13:43:25.654513 4936 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-l7pm4" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.737180 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.737249 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2ph\" (UniqueName: \"kubernetes.io/projected/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-kube-api-access-qg2ph\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.737267 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.738368 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.745044 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q587x\" (UID: \"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.754600 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2ph\" (UniqueName: \"kubernetes.io/projected/18fdb3dd-ed9e-4625-9bb8-7f2a079396dd-kube-api-access-qg2ph\") pod \"marketplace-operator-79b997595-q587x\" (UID: 
\"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.765874 4936 generic.go:334] "Generic (PLEG): container finished" podID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerID="8daf1693a5932fe145210cd7ec8209aeb27b7b487d521e7fe1388715836e4e26" exitCode=0 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.765960 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" event={"ID":"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f","Type":"ContainerDied","Data":"8daf1693a5932fe145210cd7ec8209aeb27b7b487d521e7fe1388715836e4e26"} Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.775323 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba953910-a090-4e61-88f2-f1718da34ce3" containerID="6a29ca35f743dd7c827c3edc866cd74f8c43466954b9ed2c7c264993b8f282d2" exitCode=0 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.775635 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjhk" event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerDied","Data":"6a29ca35f743dd7c827c3edc866cd74f8c43466954b9ed2c7c264993b8f282d2"} Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.777556 4936 generic.go:334] "Generic (PLEG): container finished" podID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerID="fd114c3e587a0fd263ff933164e6e700bc0ca89b5bf33d8428a2850f57f78d83" exitCode=0 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.777700 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerDied","Data":"fd114c3e587a0fd263ff933164e6e700bc0ca89b5bf33d8428a2850f57f78d83"} Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.785899 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerID="75ecf99a02c44e96af75f154aa05f953863a2a9ac08ac5f783a2b1f6b904a008" exitCode=0 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.785963 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerDied","Data":"75ecf99a02c44e96af75f154aa05f953863a2a9ac08ac5f783a2b1f6b904a008"} Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.794246 4936 generic.go:334] "Generic (PLEG): container finished" podID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerID="b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" exitCode=0 Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.794300 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerDied","Data":"b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1"} Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.827838 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.913358 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:43:25 crc kubenswrapper[4936]: I0930 13:43:25.975504 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.033324 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041042 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities\") pod \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041089 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content\") pod \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041120 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities\") pod \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041181 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content\") pod \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041202 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9tmr\" (UniqueName: \"kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr\") pod \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\" (UID: \"eea297dc-9577-48fb-b7b4-e1731a2d44d7\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.041237 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-mdfll\" (UniqueName: \"kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll\") pod \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\" (UID: \"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.043410 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities" (OuterVolumeSpecName: "utilities") pod "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" (UID: "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.047836 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr" (OuterVolumeSpecName: "kube-api-access-n9tmr") pod "eea297dc-9577-48fb-b7b4-e1731a2d44d7" (UID: "eea297dc-9577-48fb-b7b4-e1731a2d44d7"). InnerVolumeSpecName "kube-api-access-n9tmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.056558 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities" (OuterVolumeSpecName: "utilities") pod "eea297dc-9577-48fb-b7b4-e1731a2d44d7" (UID: "eea297dc-9577-48fb-b7b4-e1731a2d44d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.077543 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll" (OuterVolumeSpecName: "kube-api-access-mdfll") pod "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" (UID: "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60"). InnerVolumeSpecName "kube-api-access-mdfll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.083527 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.085855 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.144143 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca\") pod \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.144863 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc486\" (UniqueName: \"kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486\") pod \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.144996 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content\") pod \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.145390 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6zjk\" (UniqueName: \"kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk\") pod \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.145835 4936 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eea297dc-9577-48fb-b7b4-e1731a2d44d7" (UID: "eea297dc-9577-48fb-b7b4-e1731a2d44d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.146654 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities" (OuterVolumeSpecName: "utilities") pod "25d411f6-43fa-4ee8-a820-b5f77a3d94ac" (UID: "25d411f6-43fa-4ee8-a820-b5f77a3d94ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.145649 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities\") pod \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\" (UID: \"25d411f6-43fa-4ee8-a820-b5f77a3d94ac\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147561 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics\") pod \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\" (UID: \"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147760 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147773 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eea297dc-9577-48fb-b7b4-e1731a2d44d7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147782 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147790 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147797 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9tmr\" (UniqueName: \"kubernetes.io/projected/eea297dc-9577-48fb-b7b4-e1731a2d44d7-kube-api-access-n9tmr\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.147807 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdfll\" (UniqueName: \"kubernetes.io/projected/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-kube-api-access-mdfll\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.152463 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486" (OuterVolumeSpecName: "kube-api-access-xc486") pod "25d411f6-43fa-4ee8-a820-b5f77a3d94ac" (UID: "25d411f6-43fa-4ee8-a820-b5f77a3d94ac"). InnerVolumeSpecName "kube-api-access-xc486". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.154025 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" (UID: "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.157090 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" (UID: "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.160671 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk" (OuterVolumeSpecName: "kube-api-access-x6zjk") pod "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" (UID: "a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f"). InnerVolumeSpecName "kube-api-access-x6zjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.205389 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25d411f6-43fa-4ee8-a820-b5f77a3d94ac" (UID: "25d411f6-43fa-4ee8-a820-b5f77a3d94ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.230102 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" (UID: "4ebbf1be-a135-4aab-8f09-5d77fc1b1a60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.248908 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities\") pod \"ba953910-a090-4e61-88f2-f1718da34ce3\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.248967 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content\") pod \"ba953910-a090-4e61-88f2-f1718da34ce3\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249055 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d\") pod \"ba953910-a090-4e61-88f2-f1718da34ce3\" (UID: \"ba953910-a090-4e61-88f2-f1718da34ce3\") " Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249278 4936 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249290 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249299 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc486\" (UniqueName: \"kubernetes.io/projected/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-kube-api-access-xc486\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249308 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d411f6-43fa-4ee8-a820-b5f77a3d94ac-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249345 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6zjk\" (UniqueName: \"kubernetes.io/projected/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-kube-api-access-x6zjk\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.249355 4936 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.250438 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities" (OuterVolumeSpecName: "utilities") pod "ba953910-a090-4e61-88f2-f1718da34ce3" (UID: "ba953910-a090-4e61-88f2-f1718da34ce3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.251592 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d" (OuterVolumeSpecName: "kube-api-access-rxn4d") pod "ba953910-a090-4e61-88f2-f1718da34ce3" (UID: "ba953910-a090-4e61-88f2-f1718da34ce3"). InnerVolumeSpecName "kube-api-access-rxn4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.266288 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba953910-a090-4e61-88f2-f1718da34ce3" (UID: "ba953910-a090-4e61-88f2-f1718da34ce3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.350639 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxn4d\" (UniqueName: \"kubernetes.io/projected/ba953910-a090-4e61-88f2-f1718da34ce3-kube-api-access-rxn4d\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.350666 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.350676 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba953910-a090-4e61-88f2-f1718da34ce3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.383349 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q587x"] Sep 30 13:43:26 crc kubenswrapper[4936]: 
W0930 13:43:26.389860 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fdb3dd_ed9e_4625_9bb8_7f2a079396dd.slice/crio-b14617d3e462a66ebfb10e3f42628ec6b494751d7297a2d5a26b88124d3c1a9c WatchSource:0}: Error finding container b14617d3e462a66ebfb10e3f42628ec6b494751d7297a2d5a26b88124d3c1a9c: Status 404 returned error can't find the container with id b14617d3e462a66ebfb10e3f42628ec6b494751d7297a2d5a26b88124d3c1a9c Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.807823 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7pm4" event={"ID":"4ebbf1be-a135-4aab-8f09-5d77fc1b1a60","Type":"ContainerDied","Data":"3fe5fc1711f845ec3c73a5381f923c47115864b2a31e5300eee3664d0981eaa7"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.808207 4936 scope.go:117] "RemoveContainer" containerID="b148a1d4fc5970de094a33ec448ea8943f43cd138bd7e951b6f66fa8808c70c1" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.808428 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7pm4" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.812925 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" event={"ID":"a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f","Type":"ContainerDied","Data":"b2cff9db225280d1f476c5b293dc582ebeec76fb57351f3ee7a7d5650dfd75b2"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.813013 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7jxz9" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.816148 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" event={"ID":"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd","Type":"ContainerStarted","Data":"6c957b9ea60ad7dc856ac7f3670c06af45e043e76ffaad7949e66db86dbf5209"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.816185 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" event={"ID":"18fdb3dd-ed9e-4625-9bb8-7f2a079396dd","Type":"ContainerStarted","Data":"b14617d3e462a66ebfb10e3f42628ec6b494751d7297a2d5a26b88124d3c1a9c"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.816751 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.821370 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbjhk" event={"ID":"ba953910-a090-4e61-88f2-f1718da34ce3","Type":"ContainerDied","Data":"20b7b0a3320999900d861cb1fee1938352eeb80ca7790b8ff512dae76ea2e3d7"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.821379 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbjhk" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.826539 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.827640 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsdwj" event={"ID":"eea297dc-9577-48fb-b7b4-e1731a2d44d7","Type":"ContainerDied","Data":"4abc791839984b959fd82524c274906aee4821f895a3a3cec4b2312444457021"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.827667 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsdwj" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.831481 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6dxk" event={"ID":"25d411f6-43fa-4ee8-a820-b5f77a3d94ac","Type":"ContainerDied","Data":"67c47c2b56cfe0dd95ad4c644890a2c2a852ac78e82261ba734416593fa8c223"} Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.831553 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6dxk" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.833115 4936 scope.go:117] "RemoveContainer" containerID="051c91ce806f209484255146a19b01c7d80aaa7be707eec61387568ea4a94657" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.833485 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7pm4"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.838662 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.850590 4936 scope.go:117] "RemoveContainer" containerID="fbf822e670a893042e4f185c7fd60e2d1ddff97589ab9dca1e4703408f15af05" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.852176 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.853560 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbjhk"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.870266 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.874393 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7jxz9"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.880865 4936 scope.go:117] "RemoveContainer" containerID="8daf1693a5932fe145210cd7ec8209aeb27b7b487d521e7fe1388715836e4e26" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.890972 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hsdwj"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.895391 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-hsdwj"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.897588 4936 scope.go:117] "RemoveContainer" containerID="6a29ca35f743dd7c827c3edc866cd74f8c43466954b9ed2c7c264993b8f282d2" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.911125 4936 scope.go:117] "RemoveContainer" containerID="ba52a81f758ba13d5f8d90355419733133eed7f688ea5a462b837402ec430c0b" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.915646 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q587x" podStartSLOduration=1.915632016 podStartE2EDuration="1.915632016s" podCreationTimestamp="2025-09-30 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:43:26.913647752 +0000 UTC m=+257.297650053" watchObservedRunningTime="2025-09-30 13:43:26.915632016 +0000 UTC m=+257.299634317" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.926578 4936 scope.go:117] "RemoveContainer" containerID="b98da817a156fed52b39e0a27d247a657e21b1ad0323b749198918d6e84fde95" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.937500 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.945750 4936 scope.go:117] "RemoveContainer" containerID="fd114c3e587a0fd263ff933164e6e700bc0ca89b5bf33d8428a2850f57f78d83" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.957132 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m6dxk"] Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.963153 4936 scope.go:117] "RemoveContainer" containerID="bd8f1fcc3d6501322d0c4b13ff63fa918ac442d6ef2b956dc123069785fcfb67" Sep 30 13:43:26 crc kubenswrapper[4936]: I0930 13:43:26.986233 4936 scope.go:117] "RemoveContainer" 
containerID="2c23346b2e3a0886ed59e890c9ee85f6d9dfda3cd1941f6adc44d555dae4132f" Sep 30 13:43:27 crc kubenswrapper[4936]: I0930 13:43:27.023637 4936 scope.go:117] "RemoveContainer" containerID="75ecf99a02c44e96af75f154aa05f953863a2a9ac08ac5f783a2b1f6b904a008" Sep 30 13:43:27 crc kubenswrapper[4936]: I0930 13:43:27.038479 4936 scope.go:117] "RemoveContainer" containerID="0a3c2f0305bb69c3d870c0ca91b976ab06c3e655653c8c978d2a298a68abbac3" Sep 30 13:43:27 crc kubenswrapper[4936]: I0930 13:43:27.051429 4936 scope.go:117] "RemoveContainer" containerID="9325eeeb8e24420eb1f6273dd5dadcd0ff0fcf014d1bf5960bf1f7e64aba2f13" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275062 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nkx"] Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275243 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275253 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275266 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275272 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275282 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275288 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="extract-content" Sep 30 13:43:28 crc 
kubenswrapper[4936]: E0930 13:43:28.275296 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275301 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275310 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275316 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275325 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275346 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275353 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275358 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275366 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275372 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="registry-server" Sep 30 13:43:28 crc 
kubenswrapper[4936]: E0930 13:43:28.275378 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275383 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="extract-content" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275389 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275394 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275403 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275408 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275414 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275422 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="extract-utilities" Sep 30 13:43:28 crc kubenswrapper[4936]: E0930 13:43:28.275431 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275437 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" Sep 30 13:43:28 
crc kubenswrapper[4936]: I0930 13:43:28.275512 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275523 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275534 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" containerName="marketplace-operator" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275543 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.275551 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" containerName="registry-server" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.276280 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.280310 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.291012 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nkx"] Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.321998 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d411f6-43fa-4ee8-a820-b5f77a3d94ac" path="/var/lib/kubelet/pods/25d411f6-43fa-4ee8-a820-b5f77a3d94ac/volumes" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.322878 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ebbf1be-a135-4aab-8f09-5d77fc1b1a60" path="/var/lib/kubelet/pods/4ebbf1be-a135-4aab-8f09-5d77fc1b1a60/volumes" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.323677 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f" path="/var/lib/kubelet/pods/a3dc833d-9242-4dc6-ad6a-e77b1ab61a9f/volumes" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.324681 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba953910-a090-4e61-88f2-f1718da34ce3" path="/var/lib/kubelet/pods/ba953910-a090-4e61-88f2-f1718da34ce3/volumes" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.325262 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea297dc-9577-48fb-b7b4-e1731a2d44d7" path="/var/lib/kubelet/pods/eea297dc-9577-48fb-b7b4-e1731a2d44d7/volumes" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.377876 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-utilities\") pod \"redhat-marketplace-f4nkx\" (UID: 
\"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.377957 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-catalog-content\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.378017 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdn8t\" (UniqueName: \"kubernetes.io/projected/e48746c3-7005-4672-a536-f6b419f168fc-kube-api-access-gdn8t\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.478705 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdn8t\" (UniqueName: \"kubernetes.io/projected/e48746c3-7005-4672-a536-f6b419f168fc-kube-api-access-gdn8t\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.478846 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-utilities\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.478874 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-catalog-content\") pod 
\"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.479283 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-catalog-content\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.480939 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e48746c3-7005-4672-a536-f6b419f168fc-utilities\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.502373 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdn8t\" (UniqueName: \"kubernetes.io/projected/e48746c3-7005-4672-a536-f6b419f168fc-kube-api-access-gdn8t\") pod \"redhat-marketplace-f4nkx\" (UID: \"e48746c3-7005-4672-a536-f6b419f168fc\") " pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.597862 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f4nkx" Sep 30 13:43:28 crc kubenswrapper[4936]: I0930 13:43:28.984846 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f4nkx"] Sep 30 13:43:29 crc kubenswrapper[4936]: W0930 13:43:29.001098 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48746c3_7005_4672_a536_f6b419f168fc.slice/crio-4c1c0b8d14cca7b5963d6f9d1b59546335fe2f1b091b456235a28ccde9ba55b6 WatchSource:0}: Error finding container 4c1c0b8d14cca7b5963d6f9d1b59546335fe2f1b091b456235a28ccde9ba55b6: Status 404 returned error can't find the container with id 4c1c0b8d14cca7b5963d6f9d1b59546335fe2f1b091b456235a28ccde9ba55b6 Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.277235 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.278544 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.280840 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.289697 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.389062 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.389121 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.389167 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts4zk\" (UniqueName: \"kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.490475 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " 
pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.490516 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.490549 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts4zk\" (UniqueName: \"kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.491011 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.491131 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.506995 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts4zk\" (UniqueName: \"kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk\") pod \"redhat-operators-bfp5r\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " pod="openshift-marketplace/redhat-operators-bfp5r" Sep 
30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.597708 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.851733 4936 generic.go:334] "Generic (PLEG): container finished" podID="e48746c3-7005-4672-a536-f6b419f168fc" containerID="f1ea9f78704d86321f7766f5cca68b96fc8c9f1ed8929dbe9f40f8f440ddf76c" exitCode=0 Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.851777 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nkx" event={"ID":"e48746c3-7005-4672-a536-f6b419f168fc","Type":"ContainerDied","Data":"f1ea9f78704d86321f7766f5cca68b96fc8c9f1ed8929dbe9f40f8f440ddf76c"} Sep 30 13:43:29 crc kubenswrapper[4936]: I0930 13:43:29.851815 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nkx" event={"ID":"e48746c3-7005-4672-a536-f6b419f168fc","Type":"ContainerStarted","Data":"4c1c0b8d14cca7b5963d6f9d1b59546335fe2f1b091b456235a28ccde9ba55b6"} Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.001802 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 13:43:30 crc kubenswrapper[4936]: W0930 13:43:30.007502 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb315e18d_a8cb_41cc_8626_403e2204e403.slice/crio-ee2fff6044568d75132cd04e916aefcc320754b005ef47494faf23ee7c113f48 WatchSource:0}: Error finding container ee2fff6044568d75132cd04e916aefcc320754b005ef47494faf23ee7c113f48: Status 404 returned error can't find the container with id ee2fff6044568d75132cd04e916aefcc320754b005ef47494faf23ee7c113f48 Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.680680 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kljrf"] Sep 30 13:43:30 crc 
kubenswrapper[4936]: I0930 13:43:30.684811 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.690170 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.694547 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kljrf"] Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.804902 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tw4\" (UniqueName: \"kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.804954 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.804983 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.858029 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nkx" 
event={"ID":"e48746c3-7005-4672-a536-f6b419f168fc","Type":"ContainerStarted","Data":"0a1f45c10446df489e06cab00831fc40c74d725511f4ee57418d5b752d2a818b"} Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.860166 4936 generic.go:334] "Generic (PLEG): container finished" podID="b315e18d-a8cb-41cc-8626-403e2204e403" containerID="8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674" exitCode=0 Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.860198 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerDied","Data":"8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674"} Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.860213 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerStarted","Data":"ee2fff6044568d75132cd04e916aefcc320754b005ef47494faf23ee7c113f48"} Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.906498 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.906558 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tw4\" (UniqueName: \"kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.906592 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.906968 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.907125 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:30 crc kubenswrapper[4936]: I0930 13:43:30.924390 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tw4\" (UniqueName: \"kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4\") pod \"community-operators-kljrf\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") " pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.011637 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kljrf" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.407828 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kljrf"] Sep 30 13:43:31 crc kubenswrapper[4936]: W0930 13:43:31.414069 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4a0dbd_4ec7_4144_8c67_19b0716d0ed0.slice/crio-8665774d385ff82dd7ff7e92fe50e34216907b89f3ef330212f2911605681621 WatchSource:0}: Error finding container 8665774d385ff82dd7ff7e92fe50e34216907b89f3ef330212f2911605681621: Status 404 returned error can't find the container with id 8665774d385ff82dd7ff7e92fe50e34216907b89f3ef330212f2911605681621 Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.681296 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f6hgz"] Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.682951 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.684930 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.688159 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6hgz"] Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.818908 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7rwj\" (UniqueName: \"kubernetes.io/projected/2ca06e96-23c1-4b90-a643-3e36b8df9443-kube-api-access-p7rwj\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.818963 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-catalog-content\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.819000 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-utilities\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.867110 4936 generic.go:334] "Generic (PLEG): container finished" podID="e48746c3-7005-4672-a536-f6b419f168fc" containerID="0a1f45c10446df489e06cab00831fc40c74d725511f4ee57418d5b752d2a818b" exitCode=0 Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 
13:43:31.867180 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nkx" event={"ID":"e48746c3-7005-4672-a536-f6b419f168fc","Type":"ContainerDied","Data":"0a1f45c10446df489e06cab00831fc40c74d725511f4ee57418d5b752d2a818b"} Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.869473 4936 generic.go:334] "Generic (PLEG): container finished" podID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerID="90d9e4f63d26d1dc1d912a144aa3a2531c9f4917d7e9d55c5898a84dc85e35c5" exitCode=0 Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.869516 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerDied","Data":"90d9e4f63d26d1dc1d912a144aa3a2531c9f4917d7e9d55c5898a84dc85e35c5"} Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.869547 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerStarted","Data":"8665774d385ff82dd7ff7e92fe50e34216907b89f3ef330212f2911605681621"} Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.919780 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7rwj\" (UniqueName: \"kubernetes.io/projected/2ca06e96-23c1-4b90-a643-3e36b8df9443-kube-api-access-p7rwj\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.919869 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-catalog-content\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc 
kubenswrapper[4936]: I0930 13:43:31.920506 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-catalog-content\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.920601 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-utilities\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.920951 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca06e96-23c1-4b90-a643-3e36b8df9443-utilities\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:31 crc kubenswrapper[4936]: I0930 13:43:31.939184 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7rwj\" (UniqueName: \"kubernetes.io/projected/2ca06e96-23c1-4b90-a643-3e36b8df9443-kube-api-access-p7rwj\") pod \"certified-operators-f6hgz\" (UID: \"2ca06e96-23c1-4b90-a643-3e36b8df9443\") " pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.002464 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6hgz" Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.359757 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6hgz"] Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.875515 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f4nkx" event={"ID":"e48746c3-7005-4672-a536-f6b419f168fc","Type":"ContainerStarted","Data":"9919d8c6f7877a1f627304b9e7ac237d1db2fae70342079c3e4a2c2f1fffed49"} Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.877821 4936 generic.go:334] "Generic (PLEG): container finished" podID="2ca06e96-23c1-4b90-a643-3e36b8df9443" containerID="22846e52ecd13f5f5215b31606be3c0a77de3ffc95acafd8d1ac40ae93bf53b2" exitCode=0 Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.877884 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6hgz" event={"ID":"2ca06e96-23c1-4b90-a643-3e36b8df9443","Type":"ContainerDied","Data":"22846e52ecd13f5f5215b31606be3c0a77de3ffc95acafd8d1ac40ae93bf53b2"} Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.877901 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6hgz" event={"ID":"2ca06e96-23c1-4b90-a643-3e36b8df9443","Type":"ContainerStarted","Data":"6dcc2310593e5cbb39e1d3424560070d81be2dc782ef8ad513b46a6bf5c3103d"} Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.880178 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerStarted","Data":"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66"} Sep 30 13:43:32 crc kubenswrapper[4936]: I0930 13:43:32.892929 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f4nkx" 
podStartSLOduration=2.334102115 podStartE2EDuration="4.892912156s" podCreationTimestamp="2025-09-30 13:43:28 +0000 UTC" firstStartedPulling="2025-09-30 13:43:29.853038369 +0000 UTC m=+260.237040670" lastFinishedPulling="2025-09-30 13:43:32.41184841 +0000 UTC m=+262.795850711" observedRunningTime="2025-09-30 13:43:32.892005271 +0000 UTC m=+263.276007592" watchObservedRunningTime="2025-09-30 13:43:32.892912156 +0000 UTC m=+263.276914457"
Sep 30 13:43:33 crc kubenswrapper[4936]: I0930 13:43:33.886876 4936 generic.go:334] "Generic (PLEG): container finished" podID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerID="f4ab1057acf7e8f7f3dd93fff43106dcad31aab29148ee599f4d8bad67184170" exitCode=0
Sep 30 13:43:33 crc kubenswrapper[4936]: I0930 13:43:33.887276 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerDied","Data":"f4ab1057acf7e8f7f3dd93fff43106dcad31aab29148ee599f4d8bad67184170"}
Sep 30 13:43:33 crc kubenswrapper[4936]: I0930 13:43:33.889988 4936 generic.go:334] "Generic (PLEG): container finished" podID="b315e18d-a8cb-41cc-8626-403e2204e403" containerID="2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66" exitCode=0
Sep 30 13:43:33 crc kubenswrapper[4936]: I0930 13:43:33.890318 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerDied","Data":"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66"}
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.902226 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerStarted","Data":"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140"}
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.904380 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerStarted","Data":"08aaed2abf42edbc93f1633b3174930379ed5c15d7827c5159ecf31a45f1dfd2"}
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.907379 4936 generic.go:334] "Generic (PLEG): container finished" podID="2ca06e96-23c1-4b90-a643-3e36b8df9443" containerID="155283163c1d54c4dcf2ccaa9ab5a9508bd2fb6d094cd2a523789ce57ecbbf04" exitCode=0
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.907531 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6hgz" event={"ID":"2ca06e96-23c1-4b90-a643-3e36b8df9443","Type":"ContainerDied","Data":"155283163c1d54c4dcf2ccaa9ab5a9508bd2fb6d094cd2a523789ce57ecbbf04"}
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.925969 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bfp5r" podStartSLOduration=3.325307755 podStartE2EDuration="6.925953557s" podCreationTimestamp="2025-09-30 13:43:29 +0000 UTC" firstStartedPulling="2025-09-30 13:43:30.862059092 +0000 UTC m=+261.246061393" lastFinishedPulling="2025-09-30 13:43:34.462704894 +0000 UTC m=+264.846707195" observedRunningTime="2025-09-30 13:43:35.923478056 +0000 UTC m=+266.307480367" watchObservedRunningTime="2025-09-30 13:43:35.925953557 +0000 UTC m=+266.309955858"
Sep 30 13:43:35 crc kubenswrapper[4936]: I0930 13:43:35.957310 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kljrf" podStartSLOduration=3.49954521 podStartE2EDuration="5.957293789s" podCreationTimestamp="2025-09-30 13:43:30 +0000 UTC" firstStartedPulling="2025-09-30 13:43:31.870819945 +0000 UTC m=+262.254822256" lastFinishedPulling="2025-09-30 13:43:34.328568534 +0000 UTC m=+264.712570835" observedRunningTime="2025-09-30 13:43:35.955007124 +0000 UTC m=+266.339009425" watchObservedRunningTime="2025-09-30 13:43:35.957293789 +0000 UTC m=+266.341296090"
Sep 30 13:43:36 crc kubenswrapper[4936]: I0930 13:43:36.914186 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6hgz" event={"ID":"2ca06e96-23c1-4b90-a643-3e36b8df9443","Type":"ContainerStarted","Data":"371c03ba13cc9b045b8985421e4fab046a18d170f1ba02b495ab807302353606"}
Sep 30 13:43:36 crc kubenswrapper[4936]: I0930 13:43:36.932827 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f6hgz" podStartSLOduration=2.465419627 podStartE2EDuration="5.932808844s" podCreationTimestamp="2025-09-30 13:43:31 +0000 UTC" firstStartedPulling="2025-09-30 13:43:32.879443913 +0000 UTC m=+263.263446214" lastFinishedPulling="2025-09-30 13:43:36.34683313 +0000 UTC m=+266.730835431" observedRunningTime="2025-09-30 13:43:36.932553667 +0000 UTC m=+267.316555968" watchObservedRunningTime="2025-09-30 13:43:36.932808844 +0000 UTC m=+267.316811145"
Sep 30 13:43:38 crc kubenswrapper[4936]: I0930 13:43:38.598324 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f4nkx"
Sep 30 13:43:38 crc kubenswrapper[4936]: I0930 13:43:38.598398 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f4nkx"
Sep 30 13:43:38 crc kubenswrapper[4936]: I0930 13:43:38.634386 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f4nkx"
Sep 30 13:43:38 crc kubenswrapper[4936]: I0930 13:43:38.959428 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f4nkx"
Sep 30 13:43:39 crc kubenswrapper[4936]: I0930 13:43:39.598720 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bfp5r"
Sep 30 13:43:39 crc kubenswrapper[4936]: I0930 13:43:39.599022 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bfp5r"
Sep 30 13:43:40 crc kubenswrapper[4936]: I0930 13:43:40.635361 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bfp5r" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="registry-server" probeResult="failure" output=<
Sep 30 13:43:40 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s
Sep 30 13:43:40 crc kubenswrapper[4936]: >
Sep 30 13:43:41 crc kubenswrapper[4936]: I0930 13:43:41.012137 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 13:43:41 crc kubenswrapper[4936]: I0930 13:43:41.012182 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 13:43:41 crc kubenswrapper[4936]: I0930 13:43:41.054745 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 13:43:41 crc kubenswrapper[4936]: I0930 13:43:41.973764 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 13:43:42 crc kubenswrapper[4936]: I0930 13:43:42.003485 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f6hgz"
Sep 30 13:43:42 crc kubenswrapper[4936]: I0930 13:43:42.003552 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f6hgz"
Sep 30 13:43:42 crc kubenswrapper[4936]: I0930 13:43:42.036923 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f6hgz"
Sep 30 13:43:42 crc kubenswrapper[4936]: I0930 13:43:42.992720 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f6hgz"
Sep 30 13:43:49 crc kubenswrapper[4936]: I0930 13:43:49.634671 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bfp5r"
Sep 30 13:43:49 crc kubenswrapper[4936]: I0930 13:43:49.677213 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bfp5r"
Sep 30 13:44:48 crc kubenswrapper[4936]: I0930 13:44:48.250035 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:44:48 crc kubenswrapper[4936]: I0930 13:44:48.250811 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.144071 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"]
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.145969 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.147899 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.149606 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"]
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.150037 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.311304 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w475b\" (UniqueName: \"kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.311367 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.311394 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.412586 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w475b\" (UniqueName: \"kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.413193 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.413379 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.414174 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.424160 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.431829 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w475b\" (UniqueName: \"kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b\") pod \"collect-profiles-29320665-xjlsk\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.472991 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:00 crc kubenswrapper[4936]: I0930 13:45:00.638837 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"]
Sep 30 13:45:01 crc kubenswrapper[4936]: I0930 13:45:01.350738 4936 generic.go:334] "Generic (PLEG): container finished" podID="0b33ff50-1265-41e6-9b6a-d526726f71cb" containerID="64480263703ac38729b088ebaca9d8ce6b5cdd547789790c6973bffb7cab0180" exitCode=0
Sep 30 13:45:01 crc kubenswrapper[4936]: I0930 13:45:01.350786 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk" event={"ID":"0b33ff50-1265-41e6-9b6a-d526726f71cb","Type":"ContainerDied","Data":"64480263703ac38729b088ebaca9d8ce6b5cdd547789790c6973bffb7cab0180"}
Sep 30 13:45:01 crc kubenswrapper[4936]: I0930 13:45:01.350817 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk" event={"ID":"0b33ff50-1265-41e6-9b6a-d526726f71cb","Type":"ContainerStarted","Data":"30244d83065faa16bc87cb666159ce7c7e431c0847f4404bf3b0184a3a675a57"}
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.526220 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.642024 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume\") pod \"0b33ff50-1265-41e6-9b6a-d526726f71cb\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") "
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.642111 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w475b\" (UniqueName: \"kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b\") pod \"0b33ff50-1265-41e6-9b6a-d526726f71cb\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") "
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.642186 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume\") pod \"0b33ff50-1265-41e6-9b6a-d526726f71cb\" (UID: \"0b33ff50-1265-41e6-9b6a-d526726f71cb\") "
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.643019 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b33ff50-1265-41e6-9b6a-d526726f71cb" (UID: "0b33ff50-1265-41e6-9b6a-d526726f71cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.647864 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b" (OuterVolumeSpecName: "kube-api-access-w475b") pod "0b33ff50-1265-41e6-9b6a-d526726f71cb" (UID: "0b33ff50-1265-41e6-9b6a-d526726f71cb"). InnerVolumeSpecName "kube-api-access-w475b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.648509 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b33ff50-1265-41e6-9b6a-d526726f71cb" (UID: "0b33ff50-1265-41e6-9b6a-d526726f71cb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.743450 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w475b\" (UniqueName: \"kubernetes.io/projected/0b33ff50-1265-41e6-9b6a-d526726f71cb-kube-api-access-w475b\") on node \"crc\" DevicePath \"\""
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.743498 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b33ff50-1265-41e6-9b6a-d526726f71cb-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 13:45:02 crc kubenswrapper[4936]: I0930 13:45:02.743513 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b33ff50-1265-41e6-9b6a-d526726f71cb-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 13:45:03 crc kubenswrapper[4936]: I0930 13:45:03.360839 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk" event={"ID":"0b33ff50-1265-41e6-9b6a-d526726f71cb","Type":"ContainerDied","Data":"30244d83065faa16bc87cb666159ce7c7e431c0847f4404bf3b0184a3a675a57"}
Sep 30 13:45:03 crc kubenswrapper[4936]: I0930 13:45:03.361139 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30244d83065faa16bc87cb666159ce7c7e431c0847f4404bf3b0184a3a675a57"
Sep 30 13:45:03 crc kubenswrapper[4936]: I0930 13:45:03.360850 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"
Sep 30 13:45:18 crc kubenswrapper[4936]: I0930 13:45:18.249890 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:45:18 crc kubenswrapper[4936]: I0930 13:45:18.250734 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.250783 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.251497 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.251552 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz"
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.252207 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.252285 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40" gracePeriod=600
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.616235 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40" exitCode=0
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.616517 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40"}
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.616541 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671"}
Sep 30 13:45:48 crc kubenswrapper[4936]: I0930 13:45:48.616557 4936 scope.go:117] "RemoveContainer" containerID="9d8ab847a005870b14035f6422053f52ecb9f3b12f1153722e5f4247278f407c"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.598355 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wsck7"]
Sep 30 13:46:28 crc kubenswrapper[4936]: E0930 13:46:28.599069 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b33ff50-1265-41e6-9b6a-d526726f71cb" containerName="collect-profiles"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.599080 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b33ff50-1265-41e6-9b6a-d526726f71cb" containerName="collect-profiles"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.599169 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b33ff50-1265-41e6-9b6a-d526726f71cb" containerName="collect-profiles"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.599522 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.621436 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wsck7"]
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.640958 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-bound-sa-token\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641007 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b86c88d3-e219-49ef-8417-9727e1dbc465-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641046 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-trusted-ca\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641081 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641097 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b86c88d3-e219-49ef-8417-9727e1dbc465-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641116 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-certificates\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641130 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-tls\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.641153 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlv4k\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-kube-api-access-mlv4k\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.682863 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742413 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b86c88d3-e219-49ef-8417-9727e1dbc465-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742466 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-certificates\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742485 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-tls\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742514 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlv4k\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-kube-api-access-mlv4k\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742537 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-bound-sa-token\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742558 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b86c88d3-e219-49ef-8417-9727e1dbc465-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.742598 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-trusted-ca\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.744082 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-trusted-ca\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.744386 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b86c88d3-e219-49ef-8417-9727e1dbc465-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.744684 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-certificates\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.749436 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-registry-tls\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.750674 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b86c88d3-e219-49ef-8417-9727e1dbc465-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.759550 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-bound-sa-token\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.763551 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlv4k\" (UniqueName: \"kubernetes.io/projected/b86c88d3-e219-49ef-8417-9727e1dbc465-kube-api-access-mlv4k\") pod \"image-registry-66df7c8f76-wsck7\" (UID: \"b86c88d3-e219-49ef-8417-9727e1dbc465\") " pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:28 crc kubenswrapper[4936]: I0930 13:46:28.921591 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:29 crc kubenswrapper[4936]: I0930 13:46:29.115657 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wsck7"]
Sep 30 13:46:29 crc kubenswrapper[4936]: W0930 13:46:29.134417 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86c88d3_e219_49ef_8417_9727e1dbc465.slice/crio-340780c33922e97d8e5694ccb89daef8abfc851040807ed5648a0db6e4926b81 WatchSource:0}: Error finding container 340780c33922e97d8e5694ccb89daef8abfc851040807ed5648a0db6e4926b81: Status 404 returned error can't find the container with id 340780c33922e97d8e5694ccb89daef8abfc851040807ed5648a0db6e4926b81
Sep 30 13:46:29 crc kubenswrapper[4936]: I0930 13:46:29.878396 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7" event={"ID":"b86c88d3-e219-49ef-8417-9727e1dbc465","Type":"ContainerStarted","Data":"a6ca6e8bf345e8bac8169850dc3554d59f7a78dc14a004a211570d1ef09459d7"}
Sep 30 13:46:29 crc kubenswrapper[4936]: I0930 13:46:29.878747 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7" event={"ID":"b86c88d3-e219-49ef-8417-9727e1dbc465","Type":"ContainerStarted","Data":"340780c33922e97d8e5694ccb89daef8abfc851040807ed5648a0db6e4926b81"}
Sep 30 13:46:29 crc kubenswrapper[4936]: I0930 13:46:29.878764 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:29 crc kubenswrapper[4936]: I0930 13:46:29.897829 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7" podStartSLOduration=1.897803551 podStartE2EDuration="1.897803551s" podCreationTimestamp="2025-09-30 13:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:46:29.896465971 +0000 UTC m=+440.280468312" watchObservedRunningTime="2025-09-30 13:46:29.897803551 +0000 UTC m=+440.281805872"
Sep 30 13:46:48 crc kubenswrapper[4936]: I0930 13:46:48.929897 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wsck7"
Sep 30 13:46:49 crc kubenswrapper[4936]: I0930 13:46:49.020592 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"]
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.072009 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" podUID="8bb902ae-877a-46b0-8972-2ea22f50782c" containerName="registry" containerID="cri-o://ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e" gracePeriod=30
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.372573 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6"
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469533 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvxwl\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") "
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469607 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") "
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469642 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") "
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469847 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") "
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469871 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") "
Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469898 4936 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469934 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.469969 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted\") pod \"8bb902ae-877a-46b0-8972-2ea22f50782c\" (UID: \"8bb902ae-877a-46b0-8972-2ea22f50782c\") " Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.470471 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.471193 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.479927 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.479983 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.481786 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl" (OuterVolumeSpecName: "kube-api-access-gvxwl") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "kube-api-access-gvxwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.482104 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.488030 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.488269 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8bb902ae-877a-46b0-8972-2ea22f50782c" (UID: "8bb902ae-877a-46b0-8972-2ea22f50782c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.571321 4936 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.571670 4936 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8bb902ae-877a-46b0-8972-2ea22f50782c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.571750 4936 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.571826 4936 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8bb902ae-877a-46b0-8972-2ea22f50782c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.571901 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvxwl\" (UniqueName: \"kubernetes.io/projected/8bb902ae-877a-46b0-8972-2ea22f50782c-kube-api-access-gvxwl\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.572029 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:14 crc kubenswrapper[4936]: I0930 13:47:14.572110 4936 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8bb902ae-877a-46b0-8972-2ea22f50782c-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.173462 4936 generic.go:334] "Generic (PLEG): container finished" podID="8bb902ae-877a-46b0-8972-2ea22f50782c" containerID="ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e" exitCode=0 Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.173515 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.173545 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" event={"ID":"8bb902ae-877a-46b0-8972-2ea22f50782c","Type":"ContainerDied","Data":"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e"} Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.174024 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4vgz6" event={"ID":"8bb902ae-877a-46b0-8972-2ea22f50782c","Type":"ContainerDied","Data":"123dd1f2c206580979bdc33ca51d0e4ff5491a8822662efdfcc6e1ed2ef1004c"} Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.174045 4936 scope.go:117] "RemoveContainer" containerID="ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e" Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.190554 4936 scope.go:117] "RemoveContainer" containerID="ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e" Sep 30 13:47:15 crc kubenswrapper[4936]: E0930 13:47:15.191042 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e\": container with ID starting with ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e not found: ID does not exist" containerID="ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e" Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.191111 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e"} err="failed to get container status \"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e\": rpc error: code = NotFound desc = could not find container 
\"ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e\": container with ID starting with ea4c1b34360525507890a77633eeed3602fcbb4670ea0e749024f5a37b70be3e not found: ID does not exist" Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.213739 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"] Sep 30 13:47:15 crc kubenswrapper[4936]: I0930 13:47:15.219395 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4vgz6"] Sep 30 13:47:16 crc kubenswrapper[4936]: I0930 13:47:16.323370 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb902ae-877a-46b0-8972-2ea22f50782c" path="/var/lib/kubelet/pods/8bb902ae-877a-46b0-8972-2ea22f50782c/volumes" Sep 30 13:47:48 crc kubenswrapper[4936]: I0930 13:47:48.250711 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:47:48 crc kubenswrapper[4936]: I0930 13:47:48.251555 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:48:18 crc kubenswrapper[4936]: I0930 13:48:18.250272 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:48:18 crc kubenswrapper[4936]: I0930 13:48:18.250804 4936 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.250180 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.250954 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.251027 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.251887 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.251978 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" 
containerID="cri-o://7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671" gracePeriod=600 Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.656479 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671" exitCode=0 Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.656546 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671"} Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.656802 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1"} Sep 30 13:48:48 crc kubenswrapper[4936]: I0930 13:48:48.656819 4936 scope.go:117] "RemoveContainer" containerID="b87ca749d8eeb6b23d32d1844ff33cb2b58fa56d3b380cf22d36d326ab9f6e40" Sep 30 13:49:18 crc kubenswrapper[4936]: I0930 13:49:18.997713 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwsfb"] Sep 30 13:49:18 crc kubenswrapper[4936]: E0930 13:49:18.998413 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb902ae-877a-46b0-8972-2ea22f50782c" containerName="registry" Sep 30 13:49:18 crc kubenswrapper[4936]: I0930 13:49:18.998425 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb902ae-877a-46b0-8972-2ea22f50782c" containerName="registry" Sep 30 13:49:18 crc kubenswrapper[4936]: I0930 13:49:18.998544 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb902ae-877a-46b0-8972-2ea22f50782c" containerName="registry" Sep 30 13:49:18 crc 
kubenswrapper[4936]: I0930 13:49:18.998902 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.001323 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.001536 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.008137 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wm65j"] Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.008930 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wm65j" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.010669 4936 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-knd24" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.011848 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwsfb"] Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.025582 4936 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xx4dw" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.028574 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wm65j"] Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.058372 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z76zc"] Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.059325 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.061570 4936 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sx62t" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.075779 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z76zc"] Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.180464 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tc4g\" (UniqueName: \"kubernetes.io/projected/b59b1137-6114-4d73-8593-250d0da0b741-kube-api-access-9tc4g\") pod \"cert-manager-cainjector-7f985d654d-xwsfb\" (UID: \"b59b1137-6114-4d73-8593-250d0da0b741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.180610 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8kfw\" (UniqueName: \"kubernetes.io/projected/392d0573-afef-4492-9768-2d9c4830d7b8-kube-api-access-m8kfw\") pod \"cert-manager-5b446d88c5-wm65j\" (UID: \"392d0573-afef-4492-9768-2d9c4830d7b8\") " pod="cert-manager/cert-manager-5b446d88c5-wm65j" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.180662 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ktvg\" (UniqueName: \"kubernetes.io/projected/a9648b53-2a15-447e-bca5-87692ab32278-kube-api-access-8ktvg\") pod \"cert-manager-webhook-5655c58dd6-z76zc\" (UID: \"a9648b53-2a15-447e-bca5-87692ab32278\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.282172 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tc4g\" (UniqueName: 
\"kubernetes.io/projected/b59b1137-6114-4d73-8593-250d0da0b741-kube-api-access-9tc4g\") pod \"cert-manager-cainjector-7f985d654d-xwsfb\" (UID: \"b59b1137-6114-4d73-8593-250d0da0b741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.282235 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8kfw\" (UniqueName: \"kubernetes.io/projected/392d0573-afef-4492-9768-2d9c4830d7b8-kube-api-access-m8kfw\") pod \"cert-manager-5b446d88c5-wm65j\" (UID: \"392d0573-afef-4492-9768-2d9c4830d7b8\") " pod="cert-manager/cert-manager-5b446d88c5-wm65j" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.282265 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ktvg\" (UniqueName: \"kubernetes.io/projected/a9648b53-2a15-447e-bca5-87692ab32278-kube-api-access-8ktvg\") pod \"cert-manager-webhook-5655c58dd6-z76zc\" (UID: \"a9648b53-2a15-447e-bca5-87692ab32278\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.300487 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8kfw\" (UniqueName: \"kubernetes.io/projected/392d0573-afef-4492-9768-2d9c4830d7b8-kube-api-access-m8kfw\") pod \"cert-manager-5b446d88c5-wm65j\" (UID: \"392d0573-afef-4492-9768-2d9c4830d7b8\") " pod="cert-manager/cert-manager-5b446d88c5-wm65j" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.301046 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ktvg\" (UniqueName: \"kubernetes.io/projected/a9648b53-2a15-447e-bca5-87692ab32278-kube-api-access-8ktvg\") pod \"cert-manager-webhook-5655c58dd6-z76zc\" (UID: \"a9648b53-2a15-447e-bca5-87692ab32278\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.311002 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9tc4g\" (UniqueName: \"kubernetes.io/projected/b59b1137-6114-4d73-8593-250d0da0b741-kube-api-access-9tc4g\") pod \"cert-manager-cainjector-7f985d654d-xwsfb\" (UID: \"b59b1137-6114-4d73-8593-250d0da0b741\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.322194 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.332349 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wm65j" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.379224 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.606104 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwsfb"] Sep 30 13:49:19 crc kubenswrapper[4936]: W0930 13:49:19.609967 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb59b1137_6114_4d73_8593_250d0da0b741.slice/crio-b0f6833ebb1e5b928576c812a9789a8df3658fffadf2f55d146c668d20bcb8c4 WatchSource:0}: Error finding container b0f6833ebb1e5b928576c812a9789a8df3658fffadf2f55d146c668d20bcb8c4: Status 404 returned error can't find the container with id b0f6833ebb1e5b928576c812a9789a8df3658fffadf2f55d146c668d20bcb8c4 Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.612963 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.655466 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z76zc"] Sep 30 13:49:19 crc 
kubenswrapper[4936]: W0930 13:49:19.661299 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9648b53_2a15_447e_bca5_87692ab32278.slice/crio-deb4777e6c25962890a0638fd486c47a0029e087bc0edcf648d6e1b28abf718d WatchSource:0}: Error finding container deb4777e6c25962890a0638fd486c47a0029e087bc0edcf648d6e1b28abf718d: Status 404 returned error can't find the container with id deb4777e6c25962890a0638fd486c47a0029e087bc0edcf648d6e1b28abf718d Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.744836 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wm65j"] Sep 30 13:49:19 crc kubenswrapper[4936]: W0930 13:49:19.750191 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392d0573_afef_4492_9768_2d9c4830d7b8.slice/crio-00c538e313ab49848e511ee8b974200c078d3d166f133a041e91dc1f4f9dbcfd WatchSource:0}: Error finding container 00c538e313ab49848e511ee8b974200c078d3d166f133a041e91dc1f4f9dbcfd: Status 404 returned error can't find the container with id 00c538e313ab49848e511ee8b974200c078d3d166f133a041e91dc1f4f9dbcfd Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.822719 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" event={"ID":"b59b1137-6114-4d73-8593-250d0da0b741","Type":"ContainerStarted","Data":"b0f6833ebb1e5b928576c812a9789a8df3658fffadf2f55d146c668d20bcb8c4"} Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.823868 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wm65j" event={"ID":"392d0573-afef-4492-9768-2d9c4830d7b8","Type":"ContainerStarted","Data":"00c538e313ab49848e511ee8b974200c078d3d166f133a041e91dc1f4f9dbcfd"} Sep 30 13:49:19 crc kubenswrapper[4936]: I0930 13:49:19.824873 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" event={"ID":"a9648b53-2a15-447e-bca5-87692ab32278","Type":"ContainerStarted","Data":"deb4777e6c25962890a0638fd486c47a0029e087bc0edcf648d6e1b28abf718d"} Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.846697 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" event={"ID":"b59b1137-6114-4d73-8593-250d0da0b741","Type":"ContainerStarted","Data":"e7c617afb131211e2f40daf1e21a0c34df081ab57dac258c77436bcbef526918"} Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.847822 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wm65j" event={"ID":"392d0573-afef-4492-9768-2d9c4830d7b8","Type":"ContainerStarted","Data":"470f47bbe384e4c5f0a33502c1d9ed1cbcb6f1eb23721328ad3c999254410e53"} Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.848944 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" event={"ID":"a9648b53-2a15-447e-bca5-87692ab32278","Type":"ContainerStarted","Data":"d2050f8f7518c57f3cc3345eff47d2716da032519fea79f84519b68b13e3fd0a"} Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.849092 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.861869 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwsfb" podStartSLOduration=2.767998849 podStartE2EDuration="5.86185287s" podCreationTimestamp="2025-09-30 13:49:18 +0000 UTC" firstStartedPulling="2025-09-30 13:49:19.612745508 +0000 UTC m=+609.996747809" lastFinishedPulling="2025-09-30 13:49:22.706599529 +0000 UTC m=+613.090601830" observedRunningTime="2025-09-30 13:49:23.858562176 +0000 UTC m=+614.242564477" watchObservedRunningTime="2025-09-30 13:49:23.86185287 +0000 UTC 
m=+614.245855171" Sep 30 13:49:23 crc kubenswrapper[4936]: I0930 13:49:23.873561 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" podStartSLOduration=1.767553365 podStartE2EDuration="4.873545502s" podCreationTimestamp="2025-09-30 13:49:19 +0000 UTC" firstStartedPulling="2025-09-30 13:49:19.663544153 +0000 UTC m=+610.047546454" lastFinishedPulling="2025-09-30 13:49:22.76953629 +0000 UTC m=+613.153538591" observedRunningTime="2025-09-30 13:49:23.871254397 +0000 UTC m=+614.255256698" watchObservedRunningTime="2025-09-30 13:49:23.873545502 +0000 UTC m=+614.257547793" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.382849 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-z76zc" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.406894 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-wm65j" podStartSLOduration=8.451489603 podStartE2EDuration="11.406875295s" podCreationTimestamp="2025-09-30 13:49:18 +0000 UTC" firstStartedPulling="2025-09-30 13:49:19.752152874 +0000 UTC m=+610.136155175" lastFinishedPulling="2025-09-30 13:49:22.707538566 +0000 UTC m=+613.091540867" observedRunningTime="2025-09-30 13:49:23.887254652 +0000 UTC m=+614.271256953" watchObservedRunningTime="2025-09-30 13:49:29.406875295 +0000 UTC m=+619.790877616" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.471858 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vnws"] Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472295 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="nbdb" containerID="cri-o://70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" gracePeriod=30 
Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472420 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="sbdb" containerID="cri-o://514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472460 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="northd" containerID="cri-o://add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472439 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472507 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-acl-logging" containerID="cri-o://6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472507 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-node" containerID="cri-o://7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.472259 4936 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-controller" containerID="cri-o://e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.512993 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" containerID="cri-o://53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" gracePeriod=30 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.864892 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/3.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.867425 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovn-acl-logging/0.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.867825 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovn-controller/0.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.868219 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.879258 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/2.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.879809 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/1.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.879841 4936 generic.go:334] "Generic (PLEG): container finished" podID="9dbb1e3f-927e-4587-835e-b21370b33262" containerID="326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e" exitCode=2 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.879887 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerDied","Data":"326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.879925 4936 scope.go:117] "RemoveContainer" containerID="c2dd4dee574c3aee3fe81fee19f41aa90b0a6340eb8677847a2006a1ba906e34" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.880806 4936 scope.go:117] "RemoveContainer" containerID="326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.881211 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vxjrh_openshift-multus(9dbb1e3f-927e-4587-835e-b21370b33262)\"" pod="openshift-multus/multus-vxjrh" podUID="9dbb1e3f-927e-4587-835e-b21370b33262" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.884050 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovnkube-controller/3.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.888576 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovn-acl-logging/0.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889105 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7vnws_166715eb-a672-4111-b64e-626a0f7b0d74/ovn-controller/0.log" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889518 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889552 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889629 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889646 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889657 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889665 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" exitCode=0 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889674 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" exitCode=143 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889683 4936 generic.go:334] "Generic (PLEG): container finished" podID="166715eb-a672-4111-b64e-626a0f7b0d74" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" exitCode=143 Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889739 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889772 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889811 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889826 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889840 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889892 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889908 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889920 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889929 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889936 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889962 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889968 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889974 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889981 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889987 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.889994 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890003 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890015 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890064 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890072 4936 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890078 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890084 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890091 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890119 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890126 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890133 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890139 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890149 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890161 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890169 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890176 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890214 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890221 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890228 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890234 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 
13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890241 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890247 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890253 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890281 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" event={"ID":"166715eb-a672-4111-b64e-626a0f7b0d74","Type":"ContainerDied","Data":"493dfd956fcb83a3139b4a2529fcc5a83c028b75cd93df862ba9bcfaaea7bfc0"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890294 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890301 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890308 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890315 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890321 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890367 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890377 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890384 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890390 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890397 4936 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.890598 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7vnws" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928498 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whg4j"] Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928772 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="sbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928782 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="sbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928796 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kubecfg-setup" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928802 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kubecfg-setup" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928810 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="nbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928816 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="nbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928826 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-acl-logging" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928832 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-acl-logging" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928838 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-node" Sep 30 
13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928846 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-node" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928853 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928858 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928868 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928873 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928880 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928888 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928896 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="northd" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928902 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="northd" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928908 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 
30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928913 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928922 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928927 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.928936 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.928941 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929037 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929047 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929055 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="nbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929062 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929070 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" 
containerName="northd" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929078 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="sbdb" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929088 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovn-acl-logging" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929096 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929105 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929112 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929120 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929126 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="kube-rbac-proxy-node" Sep 30 13:49:29 crc kubenswrapper[4936]: E0930 13:49:29.929223 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.929231 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" containerName="ovnkube-controller" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.930576 4936 scope.go:117] "RemoveContainer" 
containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.931829 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.946150 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.965063 4936 scope.go:117] "RemoveContainer" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.976695 4936 scope.go:117] "RemoveContainer" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.987843 4936 scope.go:117] "RemoveContainer" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 13:49:29 crc kubenswrapper[4936]: I0930 13:49:29.999370 4936 scope.go:117] "RemoveContainer" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.009895 4936 scope.go:117] "RemoveContainer" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022378 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022411 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: 
\"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022432 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022444 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022469 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022488 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022502 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022513 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022526 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022544 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022562 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h77cz\" (UniqueName: \"kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022579 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022594 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: 
\"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022608 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022627 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022668 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022687 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022714 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022727 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.022744 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch\") pod \"166715eb-a672-4111-b64e-626a0f7b0d74\" (UID: \"166715eb-a672-4111-b64e-626a0f7b0d74\") " Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023209 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023475 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket" (OuterVolumeSpecName: "log-socket") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023547 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash" (OuterVolumeSpecName: "host-slash") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023584 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023605 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023664 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023730 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.023755 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log" (OuterVolumeSpecName: "node-log") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.024655 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.024680 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.024697 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025034 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025054 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025075 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025401 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025653 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.025917 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.026776 4936 scope.go:117] "RemoveContainer" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.029163 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz" (OuterVolumeSpecName: "kube-api-access-h77cz") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "kube-api-access-h77cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.029491 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.037437 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "166715eb-a672-4111-b64e-626a0f7b0d74" (UID: "166715eb-a672-4111-b64e-626a0f7b0d74"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.038138 4936 scope.go:117] "RemoveContainer" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.050555 4936 scope.go:117] "RemoveContainer" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.063131 4936 scope.go:117] "RemoveContainer" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.063646 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": container with ID starting with 53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302 not found: ID does not exist" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.063685 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} err="failed to get container status \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": rpc error: code = NotFound desc = could not find container \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": container with ID starting with 
53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.063743 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.064120 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": container with ID starting with c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35 not found: ID does not exist" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.064143 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} err="failed to get container status \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": rpc error: code = NotFound desc = could not find container \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": container with ID starting with c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.064156 4936 scope.go:117] "RemoveContainer" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.064404 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": container with ID starting with 514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb not found: ID does not exist" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:30 crc 
kubenswrapper[4936]: I0930 13:49:30.064498 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} err="failed to get container status \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": rpc error: code = NotFound desc = could not find container \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": container with ID starting with 514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.064669 4936 scope.go:117] "RemoveContainer" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.065070 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": container with ID starting with 70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075 not found: ID does not exist" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065096 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} err="failed to get container status \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": rpc error: code = NotFound desc = could not find container \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": container with ID starting with 70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065113 4936 scope.go:117] "RemoveContainer" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 
13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.065573 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": container with ID starting with add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183 not found: ID does not exist" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065607 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} err="failed to get container status \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": rpc error: code = NotFound desc = could not find container \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": container with ID starting with add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065632 4936 scope.go:117] "RemoveContainer" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.065879 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": container with ID starting with a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1 not found: ID does not exist" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065906 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} err="failed to get container status 
\"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": rpc error: code = NotFound desc = could not find container \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": container with ID starting with a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.065921 4936 scope.go:117] "RemoveContainer" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.066136 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": container with ID starting with 7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495 not found: ID does not exist" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066223 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} err="failed to get container status \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": rpc error: code = NotFound desc = could not find container \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": container with ID starting with 7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066299 4936 scope.go:117] "RemoveContainer" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.066585 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": container with ID starting with 6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f not found: ID does not exist" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066616 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} err="failed to get container status \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": rpc error: code = NotFound desc = could not find container \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": container with ID starting with 6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066630 4936 scope.go:117] "RemoveContainer" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.066850 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": container with ID starting with e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087 not found: ID does not exist" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066923 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} err="failed to get container status \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": rpc error: code = NotFound desc = could not find container \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": container with ID 
starting with e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.066982 4936 scope.go:117] "RemoveContainer" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: E0930 13:49:30.067377 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": container with ID starting with 859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37 not found: ID does not exist" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.067407 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} err="failed to get container status \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": rpc error: code = NotFound desc = could not find container \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": container with ID starting with 859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.067427 4936 scope.go:117] "RemoveContainer" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.067685 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} err="failed to get container status \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": rpc error: code = NotFound desc = could not find container \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": 
container with ID starting with 53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.067708 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.067954 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} err="failed to get container status \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": rpc error: code = NotFound desc = could not find container \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": container with ID starting with c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.068022 4936 scope.go:117] "RemoveContainer" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.068299 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} err="failed to get container status \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": rpc error: code = NotFound desc = could not find container \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": container with ID starting with 514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.068325 4936 scope.go:117] "RemoveContainer" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.068582 4936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} err="failed to get container status \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": rpc error: code = NotFound desc = could not find container \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": container with ID starting with 70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.068694 4936 scope.go:117] "RemoveContainer" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.069153 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} err="failed to get container status \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": rpc error: code = NotFound desc = could not find container \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": container with ID starting with add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.069221 4936 scope.go:117] "RemoveContainer" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.070241 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} err="failed to get container status \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": rpc error: code = NotFound desc = could not find container \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": container with ID starting with a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1 not found: ID does not 
exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.070361 4936 scope.go:117] "RemoveContainer" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.070774 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} err="failed to get container status \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": rpc error: code = NotFound desc = could not find container \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": container with ID starting with 7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.070812 4936 scope.go:117] "RemoveContainer" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071078 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} err="failed to get container status \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": rpc error: code = NotFound desc = could not find container \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": container with ID starting with 6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071150 4936 scope.go:117] "RemoveContainer" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071459 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} err="failed to get container status 
\"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": rpc error: code = NotFound desc = could not find container \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": container with ID starting with e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071537 4936 scope.go:117] "RemoveContainer" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071818 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} err="failed to get container status \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": rpc error: code = NotFound desc = could not find container \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": container with ID starting with 859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.071895 4936 scope.go:117] "RemoveContainer" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072130 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} err="failed to get container status \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": rpc error: code = NotFound desc = could not find container \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": container with ID starting with 53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072155 4936 scope.go:117] "RemoveContainer" 
containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072406 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} err="failed to get container status \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": rpc error: code = NotFound desc = could not find container \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": container with ID starting with c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072427 4936 scope.go:117] "RemoveContainer" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072655 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} err="failed to get container status \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": rpc error: code = NotFound desc = could not find container \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": container with ID starting with 514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.072753 4936 scope.go:117] "RemoveContainer" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073031 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} err="failed to get container status \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": rpc error: code = NotFound desc = could 
not find container \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": container with ID starting with 70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073105 4936 scope.go:117] "RemoveContainer" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073368 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} err="failed to get container status \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": rpc error: code = NotFound desc = could not find container \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": container with ID starting with add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073476 4936 scope.go:117] "RemoveContainer" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073784 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} err="failed to get container status \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": rpc error: code = NotFound desc = could not find container \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": container with ID starting with a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.073874 4936 scope.go:117] "RemoveContainer" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 
13:49:30.074164 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} err="failed to get container status \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": rpc error: code = NotFound desc = could not find container \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": container with ID starting with 7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.074269 4936 scope.go:117] "RemoveContainer" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.074623 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} err="failed to get container status \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": rpc error: code = NotFound desc = could not find container \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": container with ID starting with 6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.074706 4936 scope.go:117] "RemoveContainer" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075048 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} err="failed to get container status \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": rpc error: code = NotFound desc = could not find container \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": container with ID starting with 
e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075148 4936 scope.go:117] "RemoveContainer" containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075464 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} err="failed to get container status \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": rpc error: code = NotFound desc = could not find container \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": container with ID starting with 859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075552 4936 scope.go:117] "RemoveContainer" containerID="53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075891 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302"} err="failed to get container status \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": rpc error: code = NotFound desc = could not find container \"53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302\": container with ID starting with 53059a639dea0c4fba7f12a45b2c69583871597dfd7399b4f976ef0abafe5302 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.075973 4936 scope.go:117] "RemoveContainer" containerID="c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.076326 4936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35"} err="failed to get container status \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": rpc error: code = NotFound desc = could not find container \"c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35\": container with ID starting with c813e4b096114821dbf9c3a2224d5300bca29b6ecc4d5ecc7eb1d1b033927c35 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.076395 4936 scope.go:117] "RemoveContainer" containerID="514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.076663 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb"} err="failed to get container status \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": rpc error: code = NotFound desc = could not find container \"514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb\": container with ID starting with 514cdd73aa1e14455ec4d43fdea58f8dd82d7f4e8c2380222b34d89073a591eb not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.076789 4936 scope.go:117] "RemoveContainer" containerID="70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.077034 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075"} err="failed to get container status \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": rpc error: code = NotFound desc = could not find container \"70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075\": container with ID starting with 70fbaa4b9221a07365fe60bd6717d8f75f628f8e90d54ba20b22bbbd51d77075 not found: ID does not 
exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.077110 4936 scope.go:117] "RemoveContainer" containerID="add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.077504 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183"} err="failed to get container status \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": rpc error: code = NotFound desc = could not find container \"add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183\": container with ID starting with add1b5ca704076b72bde96229f24998df5d6d36f636d971622cfefd6c5278183 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.077622 4936 scope.go:117] "RemoveContainer" containerID="a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.077930 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1"} err="failed to get container status \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": rpc error: code = NotFound desc = could not find container \"a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1\": container with ID starting with a9e61c7b35cd614fe9bbdfef49dd995ae8e92470213847eb21990f7157505da1 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.078053 4936 scope.go:117] "RemoveContainer" containerID="7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.078406 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495"} err="failed to get container status 
\"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": rpc error: code = NotFound desc = could not find container \"7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495\": container with ID starting with 7a312d60de80281bedfc92ebfbb3c37a7b22e778e80097b244d803ee8098f495 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.078429 4936 scope.go:117] "RemoveContainer" containerID="6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.078637 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f"} err="failed to get container status \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": rpc error: code = NotFound desc = could not find container \"6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f\": container with ID starting with 6228d60472736ac80add39f252e3c688b4134d00b54a2807a22897f79339569f not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.078731 4936 scope.go:117] "RemoveContainer" containerID="e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.080797 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087"} err="failed to get container status \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": rpc error: code = NotFound desc = could not find container \"e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087\": container with ID starting with e7247ac2660f7c76cbd17c8b1ec8350dda2505ea46f911253645aa1020048087 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.080904 4936 scope.go:117] "RemoveContainer" 
containerID="859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.081208 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37"} err="failed to get container status \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": rpc error: code = NotFound desc = could not find container \"859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37\": container with ID starting with 859a3f3ff49a604e8806b8cd4ea48a09f7c1c25200d01e860fc933225a1e0b37 not found: ID does not exist" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124589 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-node-log\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124637 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-ovn\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124659 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-script-lib\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124797 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-systemd\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124833 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124869 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124896 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-config\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124955 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-bin\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.124986 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-netns\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125006 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovn-node-metrics-cert\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125099 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-var-lib-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125118 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5vw\" (UniqueName: \"kubernetes.io/projected/66a9e916-dabd-47b3-b8ef-53ff7011819c-kube-api-access-8d5vw\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125143 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125182 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-etc-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125208 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-env-overrides\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125230 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-kubelet\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125245 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-systemd-units\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-netd\") pod \"ovnkube-node-whg4j\" (UID: 
\"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-slash\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125356 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-log-socket\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125421 4936 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125431 4936 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125440 4936 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125450 4936 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125459 4936 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/166715eb-a672-4111-b64e-626a0f7b0d74-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125469 4936 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125477 4936 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125484 4936 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125493 4936 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125502 4936 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125510 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h77cz\" (UniqueName: \"kubernetes.io/projected/166715eb-a672-4111-b64e-626a0f7b0d74-kube-api-access-h77cz\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125519 4936 reconciler_common.go:293] "Volume detached for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125527 4936 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125536 4936 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125545 4936 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125555 4936 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125565 4936 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125573 4936 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/166715eb-a672-4111-b64e-626a0f7b0d74-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125581 4936 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.125589 4936 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/166715eb-a672-4111-b64e-626a0f7b0d74-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.218805 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vnws"] Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.223561 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7vnws"] Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227497 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-var-lib-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227638 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5vw\" (UniqueName: \"kubernetes.io/projected/66a9e916-dabd-47b3-b8ef-53ff7011819c-kube-api-access-8d5vw\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227739 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 
13:49:30.227876 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-etc-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227965 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-env-overrides\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228053 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-kubelet\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228133 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-systemd-units\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228218 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-netd\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227609 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-var-lib-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.227995 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228354 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-kubelet\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228019 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-etc-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228500 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-netd\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228522 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-systemd-units\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228631 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-slash\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228727 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-log-socket\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228826 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-node-log\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228926 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-ovn\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228758 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-log-socket\") pod \"ovnkube-node-whg4j\" (UID: 
\"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228831 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-env-overrides\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228875 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-node-log\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.228673 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-slash\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229099 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-ovn\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229251 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-script-lib\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: 
I0930 13:49:30.229357 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-systemd\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229443 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229536 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229624 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-config\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229728 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-bin\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229797 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-ovn-kubernetes\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229841 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-systemd\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229923 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-netns\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.230009 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovn-node-metrics-cert\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.230318 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-config\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.230383 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-run-openvswitch\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.229771 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovnkube-script-lib\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.230424 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-cni-bin\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.230494 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66a9e916-dabd-47b3-b8ef-53ff7011819c-host-run-netns\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.233774 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66a9e916-dabd-47b3-b8ef-53ff7011819c-ovn-node-metrics-cert\") pod \"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.243284 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5vw\" (UniqueName: \"kubernetes.io/projected/66a9e916-dabd-47b3-b8ef-53ff7011819c-kube-api-access-8d5vw\") pod 
\"ovnkube-node-whg4j\" (UID: \"66a9e916-dabd-47b3-b8ef-53ff7011819c\") " pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.248319 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.322630 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166715eb-a672-4111-b64e-626a0f7b0d74" path="/var/lib/kubelet/pods/166715eb-a672-4111-b64e-626a0f7b0d74/volumes" Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.897036 4936 generic.go:334] "Generic (PLEG): container finished" podID="66a9e916-dabd-47b3-b8ef-53ff7011819c" containerID="3b72cc664dd4b001619ce860ecfe5eb49501069cedf2f4ad6d6eacc034264a83" exitCode=0 Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.898399 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerDied","Data":"3b72cc664dd4b001619ce860ecfe5eb49501069cedf2f4ad6d6eacc034264a83"} Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.898507 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"7c44f0f2edc7b25ec122d68a43b8752347e19c4b0e1258bcd293397950e1effe"} Sep 30 13:49:30 crc kubenswrapper[4936]: I0930 13:49:30.902666 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/2.log" Sep 30 13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.909649 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"2cd0a6fe8e7a404679595ae8ee0bd457e9f175ad51add156f44327bb97b40e2e"} Sep 30 
13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.910015 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"baa14074d3dd62024ddc2985dee3a5f3dc6ffe36f788ce86ae4bb910fc98208f"} Sep 30 13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.910026 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"1d2da9f7ae4660e5b754059b3b34b2f7ed83e608a39c3f6388f9e290f95930fa"} Sep 30 13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.910034 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"d0652eecec3b5f8bdb71b91d085b9144c33738b91b6b8a0bb68810ffa8d6ec02"} Sep 30 13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.910043 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"25c95d131d2bc46a9d87f3ee3780e5728125e530108d525e359a8b3e963ce43d"} Sep 30 13:49:31 crc kubenswrapper[4936]: I0930 13:49:31.910053 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"ae83991307d8a02eb73c9526a1e2a992f6db8cc7481737c19d66c2b14e48124c"} Sep 30 13:49:33 crc kubenswrapper[4936]: I0930 13:49:33.922266 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"a95e2c1fedd7aa97985712a79f5be2dc6e4137381da8fda36aba775e8a5cdf23"} Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.940452 4936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" event={"ID":"66a9e916-dabd-47b3-b8ef-53ff7011819c","Type":"ContainerStarted","Data":"8eb5e2097f6bc9283e8d974d1e0ec439362fc66489d46b3b00f7410d4b3bff31"} Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.941008 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.941021 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.941030 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.968168 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.968244 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:49:36 crc kubenswrapper[4936]: I0930 13:49:36.978945 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" podStartSLOduration=7.978927489 podStartE2EDuration="7.978927489s" podCreationTimestamp="2025-09-30 13:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:49:36.976592432 +0000 UTC m=+627.360594733" watchObservedRunningTime="2025-09-30 13:49:36.978927489 +0000 UTC m=+627.362929790" Sep 30 13:49:43 crc kubenswrapper[4936]: I0930 13:49:43.315614 4936 scope.go:117] "RemoveContainer" containerID="326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e" Sep 30 13:49:43 crc kubenswrapper[4936]: E0930 13:49:43.316308 4936 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vxjrh_openshift-multus(9dbb1e3f-927e-4587-835e-b21370b33262)\"" pod="openshift-multus/multus-vxjrh" podUID="9dbb1e3f-927e-4587-835e-b21370b33262" Sep 30 13:49:55 crc kubenswrapper[4936]: I0930 13:49:55.314845 4936 scope.go:117] "RemoveContainer" containerID="326557f59eb0f93aaa69b1eb33489ff2543bcec53c69e20c05066c6bef73b97e" Sep 30 13:49:56 crc kubenswrapper[4936]: I0930 13:49:56.043049 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vxjrh_9dbb1e3f-927e-4587-835e-b21370b33262/kube-multus/2.log" Sep 30 13:49:56 crc kubenswrapper[4936]: I0930 13:49:56.043692 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vxjrh" event={"ID":"9dbb1e3f-927e-4587-835e-b21370b33262","Type":"ContainerStarted","Data":"20bd77b94f7889eaabfc6ed523eae099ff00696e3c35799023e4d33e9433b70b"} Sep 30 13:50:00 crc kubenswrapper[4936]: I0930 13:50:00.278466 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whg4j" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.575930 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76"] Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.577383 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.585357 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.588974 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76"] Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.642663 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp6nn\" (UniqueName: \"kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.642709 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.642741 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: 
I0930 13:50:08.744225 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.744273 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp6nn\" (UniqueName: \"kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.744299 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.744757 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.744994 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.763181 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp6nn\" (UniqueName: \"kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:08 crc kubenswrapper[4936]: I0930 13:50:08.894161 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:09 crc kubenswrapper[4936]: I0930 13:50:09.072643 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76"] Sep 30 13:50:09 crc kubenswrapper[4936]: W0930 13:50:09.077907 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ece5e6c_214d_460e_bedf_19196f994946.slice/crio-9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5 WatchSource:0}: Error finding container 9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5: Status 404 returned error can't find the container with id 9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5 Sep 30 13:50:09 crc kubenswrapper[4936]: I0930 13:50:09.114959 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" 
event={"ID":"9ece5e6c-214d-460e-bedf-19196f994946","Type":"ContainerStarted","Data":"9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5"} Sep 30 13:50:10 crc kubenswrapper[4936]: I0930 13:50:10.123651 4936 generic.go:334] "Generic (PLEG): container finished" podID="9ece5e6c-214d-460e-bedf-19196f994946" containerID="0d41071fb5682cc52083bae1f472f2f1afd3a68b5b7ef34dce887b1f867841c6" exitCode=0 Sep 30 13:50:10 crc kubenswrapper[4936]: I0930 13:50:10.123698 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" event={"ID":"9ece5e6c-214d-460e-bedf-19196f994946","Type":"ContainerDied","Data":"0d41071fb5682cc52083bae1f472f2f1afd3a68b5b7ef34dce887b1f867841c6"} Sep 30 13:50:12 crc kubenswrapper[4936]: I0930 13:50:12.133090 4936 generic.go:334] "Generic (PLEG): container finished" podID="9ece5e6c-214d-460e-bedf-19196f994946" containerID="5f91ba8390b12e513f1d353f3438ced109cc3978841f610f9dc1ecdb58e1f53b" exitCode=0 Sep 30 13:50:12 crc kubenswrapper[4936]: I0930 13:50:12.133158 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" event={"ID":"9ece5e6c-214d-460e-bedf-19196f994946","Type":"ContainerDied","Data":"5f91ba8390b12e513f1d353f3438ced109cc3978841f610f9dc1ecdb58e1f53b"} Sep 30 13:50:13 crc kubenswrapper[4936]: I0930 13:50:13.144023 4936 generic.go:334] "Generic (PLEG): container finished" podID="9ece5e6c-214d-460e-bedf-19196f994946" containerID="96cda95ac3fbe3cfd298fb6595c04dce249eafca27c1a728ffe88cdc1a40f5d5" exitCode=0 Sep 30 13:50:13 crc kubenswrapper[4936]: I0930 13:50:13.144080 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" event={"ID":"9ece5e6c-214d-460e-bedf-19196f994946","Type":"ContainerDied","Data":"96cda95ac3fbe3cfd298fb6595c04dce249eafca27c1a728ffe88cdc1a40f5d5"} 
Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.362938 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.521961 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util\") pod \"9ece5e6c-214d-460e-bedf-19196f994946\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.522041 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle\") pod \"9ece5e6c-214d-460e-bedf-19196f994946\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.522067 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp6nn\" (UniqueName: \"kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn\") pod \"9ece5e6c-214d-460e-bedf-19196f994946\" (UID: \"9ece5e6c-214d-460e-bedf-19196f994946\") " Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.522577 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle" (OuterVolumeSpecName: "bundle") pod "9ece5e6c-214d-460e-bedf-19196f994946" (UID: "9ece5e6c-214d-460e-bedf-19196f994946"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.526975 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn" (OuterVolumeSpecName: "kube-api-access-rp6nn") pod "9ece5e6c-214d-460e-bedf-19196f994946" (UID: "9ece5e6c-214d-460e-bedf-19196f994946"). InnerVolumeSpecName "kube-api-access-rp6nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.623199 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp6nn\" (UniqueName: \"kubernetes.io/projected/9ece5e6c-214d-460e-bedf-19196f994946-kube-api-access-rp6nn\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.623234 4936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.697625 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util" (OuterVolumeSpecName: "util") pod "9ece5e6c-214d-460e-bedf-19196f994946" (UID: "9ece5e6c-214d-460e-bedf-19196f994946"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:50:14 crc kubenswrapper[4936]: I0930 13:50:14.724982 4936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ece5e6c-214d-460e-bedf-19196f994946-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:15 crc kubenswrapper[4936]: I0930 13:50:15.156160 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" event={"ID":"9ece5e6c-214d-460e-bedf-19196f994946","Type":"ContainerDied","Data":"9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5"} Sep 30 13:50:15 crc kubenswrapper[4936]: I0930 13:50:15.156283 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9665b2187503eee2c8fb0b1b896d1441e2fbdf7cb1fc794e9a5b1f8fa90bbbb5" Sep 30 13:50:15 crc kubenswrapper[4936]: I0930 13:50:15.156225 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.610168 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v"] Sep 30 13:50:17 crc kubenswrapper[4936]: E0930 13:50:17.610432 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="util" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.610448 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="util" Sep 30 13:50:17 crc kubenswrapper[4936]: E0930 13:50:17.610469 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="pull" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.610478 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="pull" Sep 30 13:50:17 crc kubenswrapper[4936]: E0930 13:50:17.610492 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="extract" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.610500 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="extract" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.610629 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ece5e6c-214d-460e-bedf-19196f994946" containerName="extract" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.611058 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.613924 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-r6d6j" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.614290 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.615228 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.629874 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v"] Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.658822 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrvn\" (UniqueName: \"kubernetes.io/projected/4409f314-97f6-4776-980f-c7727fa7fd18-kube-api-access-sqrvn\") pod \"nmstate-operator-5d6f6cfd66-cln2v\" (UID: \"4409f314-97f6-4776-980f-c7727fa7fd18\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" Sep 
30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.759789 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrvn\" (UniqueName: \"kubernetes.io/projected/4409f314-97f6-4776-980f-c7727fa7fd18-kube-api-access-sqrvn\") pod \"nmstate-operator-5d6f6cfd66-cln2v\" (UID: \"4409f314-97f6-4776-980f-c7727fa7fd18\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.784704 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrvn\" (UniqueName: \"kubernetes.io/projected/4409f314-97f6-4776-980f-c7727fa7fd18-kube-api-access-sqrvn\") pod \"nmstate-operator-5d6f6cfd66-cln2v\" (UID: \"4409f314-97f6-4776-980f-c7727fa7fd18\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" Sep 30 13:50:17 crc kubenswrapper[4936]: I0930 13:50:17.925851 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" Sep 30 13:50:18 crc kubenswrapper[4936]: I0930 13:50:18.187214 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v"] Sep 30 13:50:18 crc kubenswrapper[4936]: W0930 13:50:18.194543 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4409f314_97f6_4776_980f_c7727fa7fd18.slice/crio-5399739d621f8de8cadb74343d69ca310cf9c0e37873c4f57ea9692d64f3b05f WatchSource:0}: Error finding container 5399739d621f8de8cadb74343d69ca310cf9c0e37873c4f57ea9692d64f3b05f: Status 404 returned error can't find the container with id 5399739d621f8de8cadb74343d69ca310cf9c0e37873c4f57ea9692d64f3b05f Sep 30 13:50:19 crc kubenswrapper[4936]: I0930 13:50:19.178916 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" 
event={"ID":"4409f314-97f6-4776-980f-c7727fa7fd18","Type":"ContainerStarted","Data":"5399739d621f8de8cadb74343d69ca310cf9c0e37873c4f57ea9692d64f3b05f"} Sep 30 13:50:21 crc kubenswrapper[4936]: I0930 13:50:21.191707 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" event={"ID":"4409f314-97f6-4776-980f-c7727fa7fd18","Type":"ContainerStarted","Data":"d3de922b95fc091100f27ee632f4f530cad232ac60e8f5a4fe9a94a456162053"} Sep 30 13:50:21 crc kubenswrapper[4936]: I0930 13:50:21.207050 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-cln2v" podStartSLOduration=2.065527792 podStartE2EDuration="4.207034463s" podCreationTimestamp="2025-09-30 13:50:17 +0000 UTC" firstStartedPulling="2025-09-30 13:50:18.195937193 +0000 UTC m=+668.579939494" lastFinishedPulling="2025-09-30 13:50:20.337443864 +0000 UTC m=+670.721446165" observedRunningTime="2025-09-30 13:50:21.205196411 +0000 UTC m=+671.589198732" watchObservedRunningTime="2025-09-30 13:50:21.207034463 +0000 UTC m=+671.591036764" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.158600 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-gj98k"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.159834 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.161683 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-5dp7q" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.178035 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-gj98k"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.191321 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lgvdl"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.201084 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-284mm"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.201923 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.202385 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.206626 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.218523 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-284mm"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227653 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhrkm\" (UniqueName: \"kubernetes.io/projected/e80b68f2-d68e-4499-84a4-8a83b18922c6-kube-api-access-hhrkm\") pod \"nmstate-metrics-58fcddf996-gj98k\" (UID: \"e80b68f2-d68e-4499-84a4-8a83b18922c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227692 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227735 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-ovs-socket\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227760 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-dbus-socket\") pod \"nmstate-handler-lgvdl\" (UID: 
\"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227781 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzsr\" (UniqueName: \"kubernetes.io/projected/6b791ba1-37e6-440f-899d-7db4972b74f5-kube-api-access-znzsr\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227811 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-nmstate-lock\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.227828 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5c6x\" (UniqueName: \"kubernetes.io/projected/50cdc3bf-7d9e-4644-87b2-81e93c15174a-kube-api-access-x5c6x\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332140 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzsr\" (UniqueName: \"kubernetes.io/projected/6b791ba1-37e6-440f-899d-7db4972b74f5-kube-api-access-znzsr\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332470 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-nmstate-lock\") pod 
\"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332491 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5c6x\" (UniqueName: \"kubernetes.io/projected/50cdc3bf-7d9e-4644-87b2-81e93c15174a-kube-api-access-x5c6x\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332535 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhrkm\" (UniqueName: \"kubernetes.io/projected/e80b68f2-d68e-4499-84a4-8a83b18922c6-kube-api-access-hhrkm\") pod \"nmstate-metrics-58fcddf996-gj98k\" (UID: \"e80b68f2-d68e-4499-84a4-8a83b18922c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332559 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332590 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-ovs-socket\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332615 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-dbus-socket\") pod 
\"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.332872 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-dbus-socket\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.333208 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-nmstate-lock\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: E0930 13:50:22.333543 4936 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 13:50:22 crc kubenswrapper[4936]: E0930 13:50:22.333591 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair podName:50cdc3bf-7d9e-4644-87b2-81e93c15174a nodeName:}" failed. No retries permitted until 2025-09-30 13:50:22.833577506 +0000 UTC m=+673.217579797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair") pod "nmstate-webhook-6d689559c5-284mm" (UID: "50cdc3bf-7d9e-4644-87b2-81e93c15174a") : secret "openshift-nmstate-webhook" not found Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.333773 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6b791ba1-37e6-440f-899d-7db4972b74f5-ovs-socket\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.340402 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.341319 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.364124 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4rstn" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.367486 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.368181 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.370817 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.393828 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5c6x\" (UniqueName: \"kubernetes.io/projected/50cdc3bf-7d9e-4644-87b2-81e93c15174a-kube-api-access-x5c6x\") pod 
\"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.396666 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhrkm\" (UniqueName: \"kubernetes.io/projected/e80b68f2-d68e-4499-84a4-8a83b18922c6-kube-api-access-hhrkm\") pod \"nmstate-metrics-58fcddf996-gj98k\" (UID: \"e80b68f2-d68e-4499-84a4-8a83b18922c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.400737 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzsr\" (UniqueName: \"kubernetes.io/projected/6b791ba1-37e6-440f-899d-7db4972b74f5-kube-api-access-znzsr\") pod \"nmstate-handler-lgvdl\" (UID: \"6b791ba1-37e6-440f-899d-7db4972b74f5\") " pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.433557 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f690b866-399e-4bf7-bed4-c261098bfbb1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.433654 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h628g\" (UniqueName: \"kubernetes.io/projected/f690b866-399e-4bf7-bed4-c261098bfbb1-kube-api-access-h628g\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.433679 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f690b866-399e-4bf7-bed4-c261098bfbb1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.476108 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.533661 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.537185 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f690b866-399e-4bf7-bed4-c261098bfbb1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.537482 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h628g\" (UniqueName: \"kubernetes.io/projected/f690b866-399e-4bf7-bed4-c261098bfbb1-kube-api-access-h628g\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.537509 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f690b866-399e-4bf7-bed4-c261098bfbb1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.538686 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f690b866-399e-4bf7-bed4-c261098bfbb1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.545816 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f690b866-399e-4bf7-bed4-c261098bfbb1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: W0930 13:50:22.554110 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b791ba1_37e6_440f_899d_7db4972b74f5.slice/crio-b09cf08039ebbdd3b8d9fd43fd9fd00cbf2274933f66e97474e6d08e3e822922 WatchSource:0}: Error finding container b09cf08039ebbdd3b8d9fd43fd9fd00cbf2274933f66e97474e6d08e3e822922: Status 404 returned error can't find the container with id b09cf08039ebbdd3b8d9fd43fd9fd00cbf2274933f66e97474e6d08e3e822922 Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.561175 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h628g\" (UniqueName: \"kubernetes.io/projected/f690b866-399e-4bf7-bed4-c261098bfbb1-kube-api-access-h628g\") pod \"nmstate-console-plugin-864bb6dfb5-ngb5v\" (UID: \"f690b866-399e-4bf7-bed4-c261098bfbb1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.568821 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-547549646b-2g2dl"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.571673 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.579208 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547549646b-2g2dl"] Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639006 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-oauth-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639263 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639292 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-oauth-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639345 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-service-ca\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639367 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-trusted-ca-bundle\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639403 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwk49\" (UniqueName: \"kubernetes.io/projected/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-kube-api-access-nwk49\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.639436 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.677879 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740709 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwk49\" (UniqueName: \"kubernetes.io/projected/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-kube-api-access-nwk49\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740763 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740793 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-oauth-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740810 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740832 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-oauth-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: 
\"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740866 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-service-ca\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.740906 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-trusted-ca-bundle\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.741877 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-trusted-ca-bundle\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.743562 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.743800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-oauth-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " 
pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.743828 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-service-ca\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.746800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-oauth-config\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.748071 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-console-serving-cert\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.764800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwk49\" (UniqueName: \"kubernetes.io/projected/2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc-kube-api-access-nwk49\") pod \"console-547549646b-2g2dl\" (UID: \"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc\") " pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.768166 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-gj98k"] Sep 30 13:50:22 crc kubenswrapper[4936]: W0930 13:50:22.776291 4936 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80b68f2_d68e_4499_84a4_8a83b18922c6.slice/crio-e29911e45ae8404c11f6bab9d25bc2b22e0fce3bd362ffb26264d7f98c21b54a WatchSource:0}: Error finding container e29911e45ae8404c11f6bab9d25bc2b22e0fce3bd362ffb26264d7f98c21b54a: Status 404 returned error can't find the container with id e29911e45ae8404c11f6bab9d25bc2b22e0fce3bd362ffb26264d7f98c21b54a Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.842345 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.846639 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/50cdc3bf-7d9e-4644-87b2-81e93c15174a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-284mm\" (UID: \"50cdc3bf-7d9e-4644-87b2-81e93c15174a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.892254 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:22 crc kubenswrapper[4936]: I0930 13:50:22.916239 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v"] Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.107153 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547549646b-2g2dl"] Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.116853 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.219038 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547549646b-2g2dl" event={"ID":"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc","Type":"ContainerStarted","Data":"c16f27b78eb140c37142f926e7dcecf27de8a0ae35838dfc5468a43f11283bf0"} Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.221531 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" event={"ID":"f690b866-399e-4bf7-bed4-c261098bfbb1","Type":"ContainerStarted","Data":"3f049cb38412f2f454c457f8f7e03773ac5cfe9f78e5e9144894df4555583a5e"} Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.224065 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lgvdl" event={"ID":"6b791ba1-37e6-440f-899d-7db4972b74f5","Type":"ContainerStarted","Data":"b09cf08039ebbdd3b8d9fd43fd9fd00cbf2274933f66e97474e6d08e3e822922"} Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.225277 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" event={"ID":"e80b68f2-d68e-4499-84a4-8a83b18922c6","Type":"ContainerStarted","Data":"e29911e45ae8404c11f6bab9d25bc2b22e0fce3bd362ffb26264d7f98c21b54a"} Sep 30 13:50:23 crc kubenswrapper[4936]: I0930 13:50:23.322265 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-284mm"] Sep 30 13:50:23 crc kubenswrapper[4936]: W0930 13:50:23.334581 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50cdc3bf_7d9e_4644_87b2_81e93c15174a.slice/crio-5f850343b28d5b13b52165858e0eb6b33a332fba81372bd082ac85817135187f WatchSource:0}: Error finding container 5f850343b28d5b13b52165858e0eb6b33a332fba81372bd082ac85817135187f: Status 404 returned error 
can't find the container with id 5f850343b28d5b13b52165858e0eb6b33a332fba81372bd082ac85817135187f Sep 30 13:50:24 crc kubenswrapper[4936]: I0930 13:50:24.232616 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" event={"ID":"50cdc3bf-7d9e-4644-87b2-81e93c15174a","Type":"ContainerStarted","Data":"5f850343b28d5b13b52165858e0eb6b33a332fba81372bd082ac85817135187f"} Sep 30 13:50:24 crc kubenswrapper[4936]: I0930 13:50:24.234224 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547549646b-2g2dl" event={"ID":"2e1df77e-bcc0-4aa8-8fb9-5ed1214460bc","Type":"ContainerStarted","Data":"7ff5c226b936d53d0c20bd2fe36a97d6d78329435eb117f314b4559660fb7482"} Sep 30 13:50:24 crc kubenswrapper[4936]: I0930 13:50:24.272271 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-547549646b-2g2dl" podStartSLOduration=2.272250299 podStartE2EDuration="2.272250299s" podCreationTimestamp="2025-09-30 13:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:50:24.269147881 +0000 UTC m=+674.653150202" watchObservedRunningTime="2025-09-30 13:50:24.272250299 +0000 UTC m=+674.656252600" Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.251903 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" event={"ID":"50cdc3bf-7d9e-4644-87b2-81e93c15174a","Type":"ContainerStarted","Data":"c35eaa4f1ec2b8f0ddd5d262146f4ca844a7e6774ee59eec76f5c11e05f866c0"} Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.254196 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.266639 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lgvdl" 
event={"ID":"6b791ba1-37e6-440f-899d-7db4972b74f5","Type":"ContainerStarted","Data":"67c19c2df1d0fb06b507a75d2c5b6a331e47b0ca18eb21500de6296635d36c8b"} Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.267543 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.270988 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" podStartSLOduration=1.9023181230000001 podStartE2EDuration="4.270971375s" podCreationTimestamp="2025-09-30 13:50:22 +0000 UTC" firstStartedPulling="2025-09-30 13:50:23.343549503 +0000 UTC m=+673.727551804" lastFinishedPulling="2025-09-30 13:50:25.712202755 +0000 UTC m=+676.096205056" observedRunningTime="2025-09-30 13:50:26.268428343 +0000 UTC m=+676.652430664" watchObservedRunningTime="2025-09-30 13:50:26.270971375 +0000 UTC m=+676.654973666" Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.272926 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" event={"ID":"e80b68f2-d68e-4499-84a4-8a83b18922c6","Type":"ContainerStarted","Data":"5f745023b89bfe5713877a9c2d655645594fb6553942963885e8c4d5fa1b683a"} Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.277095 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" event={"ID":"f690b866-399e-4bf7-bed4-c261098bfbb1","Type":"ContainerStarted","Data":"7570bb40330c78a61b5a48851585d5a4dfad9fce47abcaaf8350338da98af541"} Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.305029 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lgvdl" podStartSLOduration=1.166330218 podStartE2EDuration="4.305011094s" podCreationTimestamp="2025-09-30 13:50:22 +0000 UTC" firstStartedPulling="2025-09-30 13:50:22.563640871 +0000 UTC 
m=+672.947643172" lastFinishedPulling="2025-09-30 13:50:25.702321757 +0000 UTC m=+676.086324048" observedRunningTime="2025-09-30 13:50:26.298551032 +0000 UTC m=+676.682553333" watchObservedRunningTime="2025-09-30 13:50:26.305011094 +0000 UTC m=+676.689013395" Sep 30 13:50:26 crc kubenswrapper[4936]: I0930 13:50:26.319170 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-ngb5v" podStartSLOduration=1.5561317049999999 podStartE2EDuration="4.319128592s" podCreationTimestamp="2025-09-30 13:50:22 +0000 UTC" firstStartedPulling="2025-09-30 13:50:22.933160586 +0000 UTC m=+673.317162887" lastFinishedPulling="2025-09-30 13:50:25.696157483 +0000 UTC m=+676.080159774" observedRunningTime="2025-09-30 13:50:26.309171992 +0000 UTC m=+676.693174293" watchObservedRunningTime="2025-09-30 13:50:26.319128592 +0000 UTC m=+676.703130893" Sep 30 13:50:29 crc kubenswrapper[4936]: I0930 13:50:29.296584 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" event={"ID":"e80b68f2-d68e-4499-84a4-8a83b18922c6","Type":"ContainerStarted","Data":"9e8ddd265fe1d1697170806a30ec253cfb2d5e5615731bcfc8f7554367e9876e"} Sep 30 13:50:29 crc kubenswrapper[4936]: I0930 13:50:29.313269 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-gj98k" podStartSLOduration=1.407253539 podStartE2EDuration="7.313255323s" podCreationTimestamp="2025-09-30 13:50:22 +0000 UTC" firstStartedPulling="2025-09-30 13:50:22.778999731 +0000 UTC m=+673.163002032" lastFinishedPulling="2025-09-30 13:50:28.685001505 +0000 UTC m=+679.069003816" observedRunningTime="2025-09-30 13:50:29.30992918 +0000 UTC m=+679.693931501" watchObservedRunningTime="2025-09-30 13:50:29.313255323 +0000 UTC m=+679.697257624" Sep 30 13:50:32 crc kubenswrapper[4936]: I0930 13:50:32.557539 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-lgvdl" Sep 30 13:50:32 crc kubenswrapper[4936]: I0930 13:50:32.893267 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:32 crc kubenswrapper[4936]: I0930 13:50:32.893595 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:32 crc kubenswrapper[4936]: I0930 13:50:32.897964 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:33 crc kubenswrapper[4936]: I0930 13:50:33.326250 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-547549646b-2g2dl" Sep 30 13:50:33 crc kubenswrapper[4936]: I0930 13:50:33.375910 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:50:43 crc kubenswrapper[4936]: I0930 13:50:43.122931 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-284mm" Sep 30 13:50:48 crc kubenswrapper[4936]: I0930 13:50:48.249893 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:50:48 crc kubenswrapper[4936]: I0930 13:50:48.250467 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.034840 4936 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb"] Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.036412 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.038403 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.046672 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb"] Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.160027 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.160309 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h454x\" (UniqueName: \"kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.160467 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.262111 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.262169 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h454x\" (UniqueName: \"kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.262247 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.262810 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.263081 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.282853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h454x\" (UniqueName: \"kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.351396 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:50:55 crc kubenswrapper[4936]: I0930 13:50:55.793758 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb"] Sep 30 13:50:56 crc kubenswrapper[4936]: I0930 13:50:56.479062 4936 generic.go:334] "Generic (PLEG): container finished" podID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerID="6da18511e304be40b5fca04a5595d2aadc80c04100af12f929a300ec82edec4e" exitCode=0 Sep 30 13:50:56 crc kubenswrapper[4936]: I0930 13:50:56.479104 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" event={"ID":"2b7261a1-f326-4692-ac33-cef53002b4eb","Type":"ContainerDied","Data":"6da18511e304be40b5fca04a5595d2aadc80c04100af12f929a300ec82edec4e"} Sep 30 13:50:56 crc kubenswrapper[4936]: I0930 13:50:56.479137 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" event={"ID":"2b7261a1-f326-4692-ac33-cef53002b4eb","Type":"ContainerStarted","Data":"18c10c806bcb069c7ffd0446d43c5f02911be5c953ce06b583853b2775fd15bb"} Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.427785 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jl85m" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerName="console" containerID="cri-o://d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b" gracePeriod=15 Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.492138 4936 generic.go:334] "Generic (PLEG): container finished" podID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerID="d393c30f8add2bdb6dc2d962ae0badf3d92789a278287a67d6d31d78e67bde58" exitCode=0 Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.492183 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" event={"ID":"2b7261a1-f326-4692-ac33-cef53002b4eb","Type":"ContainerDied","Data":"d393c30f8add2bdb6dc2d962ae0badf3d92789a278287a67d6d31d78e67bde58"} Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.767122 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jl85m_c7e5e231-b700-4151-81c8-111a3af3bfc2/console/0.log" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.767459 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910132 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910238 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910291 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf5d5\" (UniqueName: \"kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910311 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910348 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910363 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.910385 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config\") pod \"c7e5e231-b700-4151-81c8-111a3af3bfc2\" (UID: \"c7e5e231-b700-4151-81c8-111a3af3bfc2\") " Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.911202 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config" (OuterVolumeSpecName: "console-config") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.911698 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca" (OuterVolumeSpecName: "service-ca") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.911731 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.911789 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.915859 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.915903 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5" (OuterVolumeSpecName: "kube-api-access-pf5d5") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "kube-api-access-pf5d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:50:58 crc kubenswrapper[4936]: I0930 13:50:58.916054 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c7e5e231-b700-4151-81c8-111a3af3bfc2" (UID: "c7e5e231-b700-4151-81c8-111a3af3bfc2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012008 4936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012704 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf5d5\" (UniqueName: \"kubernetes.io/projected/c7e5e231-b700-4151-81c8-111a3af3bfc2-kube-api-access-pf5d5\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012758 4936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012776 4936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012789 4936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012802 4936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.012813 4936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e5e231-b700-4151-81c8-111a3af3bfc2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.498106 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jl85m_c7e5e231-b700-4151-81c8-111a3af3bfc2/console/0.log" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.498500 4936 generic.go:334] "Generic (PLEG): container finished" podID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerID="d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b" exitCode=2 Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.498599 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jl85m" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.499496 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jl85m" event={"ID":"c7e5e231-b700-4151-81c8-111a3af3bfc2","Type":"ContainerDied","Data":"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b"} Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.499628 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jl85m" event={"ID":"c7e5e231-b700-4151-81c8-111a3af3bfc2","Type":"ContainerDied","Data":"a4068dfc77dfcc96ad8374b6c6fefb3e7dc3264c1810da086406c939573573dc"} Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.499676 4936 scope.go:117] "RemoveContainer" containerID="d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.504802 4936 generic.go:334] "Generic (PLEG): container finished" podID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerID="2c114ca748d00d6b3ce85bcca921f592a97cfc6d692b9393512a1b4ceb2f1598" exitCode=0 Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.504845 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" event={"ID":"2b7261a1-f326-4692-ac33-cef53002b4eb","Type":"ContainerDied","Data":"2c114ca748d00d6b3ce85bcca921f592a97cfc6d692b9393512a1b4ceb2f1598"} Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.521647 4936 scope.go:117] "RemoveContainer" containerID="d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b" Sep 30 13:50:59 crc kubenswrapper[4936]: E0930 13:50:59.523377 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b\": container with ID starting with 
d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b not found: ID does not exist" containerID="d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.523410 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b"} err="failed to get container status \"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b\": rpc error: code = NotFound desc = could not find container \"d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b\": container with ID starting with d1d7ee47047dc1e771247bd0d411a195500813cfad5616777f316a668721ab3b not found: ID does not exist" Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.538937 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:50:59 crc kubenswrapper[4936]: I0930 13:50:59.542967 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jl85m"] Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.321280 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" path="/var/lib/kubelet/pods/c7e5e231-b700-4151-81c8-111a3af3bfc2/volumes" Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.801741 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.934118 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle\") pod \"2b7261a1-f326-4692-ac33-cef53002b4eb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.934257 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h454x\" (UniqueName: \"kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x\") pod \"2b7261a1-f326-4692-ac33-cef53002b4eb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.934294 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util\") pod \"2b7261a1-f326-4692-ac33-cef53002b4eb\" (UID: \"2b7261a1-f326-4692-ac33-cef53002b4eb\") " Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.936215 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle" (OuterVolumeSpecName: "bundle") pod "2b7261a1-f326-4692-ac33-cef53002b4eb" (UID: "2b7261a1-f326-4692-ac33-cef53002b4eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.944057 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x" (OuterVolumeSpecName: "kube-api-access-h454x") pod "2b7261a1-f326-4692-ac33-cef53002b4eb" (UID: "2b7261a1-f326-4692-ac33-cef53002b4eb"). InnerVolumeSpecName "kube-api-access-h454x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:51:00 crc kubenswrapper[4936]: I0930 13:51:00.955300 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util" (OuterVolumeSpecName: "util") pod "2b7261a1-f326-4692-ac33-cef53002b4eb" (UID: "2b7261a1-f326-4692-ac33-cef53002b4eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.038117 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h454x\" (UniqueName: \"kubernetes.io/projected/2b7261a1-f326-4692-ac33-cef53002b4eb-kube-api-access-h454x\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.038163 4936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.038179 4936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b7261a1-f326-4692-ac33-cef53002b4eb-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.520556 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" event={"ID":"2b7261a1-f326-4692-ac33-cef53002b4eb","Type":"ContainerDied","Data":"18c10c806bcb069c7ffd0446d43c5f02911be5c953ce06b583853b2775fd15bb"} Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.520599 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18c10c806bcb069c7ffd0446d43c5f02911be5c953ce06b583853b2775fd15bb" Sep 30 13:51:01 crc kubenswrapper[4936]: I0930 13:51:01.520668 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.389525 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj"] Sep 30 13:51:10 crc kubenswrapper[4936]: E0930 13:51:10.391479 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="pull" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.391585 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="pull" Sep 30 13:51:10 crc kubenswrapper[4936]: E0930 13:51:10.391678 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="extract" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.391751 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="extract" Sep 30 13:51:10 crc kubenswrapper[4936]: E0930 13:51:10.391826 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="util" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.391906 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="util" Sep 30 13:51:10 crc kubenswrapper[4936]: E0930 13:51:10.391997 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerName="console" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.392066 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" containerName="console" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.392250 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e5e231-b700-4151-81c8-111a3af3bfc2" 
containerName="console" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.392353 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7261a1-f326-4692-ac33-cef53002b4eb" containerName="extract" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.392921 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.400174 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.400783 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.403650 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.404180 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gfxw2" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.408125 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.427824 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj"] Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.443950 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 
30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.444012 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-webhook-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.444049 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ttx\" (UniqueName: \"kubernetes.io/projected/d4775401-fba2-4958-b075-6862db490e18-kube-api-access-22ttx\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.546417 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.546813 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-webhook-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.546923 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22ttx\" (UniqueName: 
\"kubernetes.io/projected/d4775401-fba2-4958-b075-6862db490e18-kube-api-access-22ttx\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.556282 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-webhook-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.575765 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d4775401-fba2-4958-b075-6862db490e18-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.579248 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ttx\" (UniqueName: \"kubernetes.io/projected/d4775401-fba2-4958-b075-6862db490e18-kube-api-access-22ttx\") pod \"metallb-operator-controller-manager-6fd76b6558-4gwpj\" (UID: \"d4775401-fba2-4958-b075-6862db490e18\") " pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.708783 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.840233 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n"] Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.841622 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.845120 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.846910 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8mr2z" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.847043 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.861415 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n"] Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.957455 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-webhook-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.957505 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqpw\" (UniqueName: \"kubernetes.io/projected/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-kube-api-access-fwqpw\") pod 
\"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:10 crc kubenswrapper[4936]: I0930 13:51:10.957541 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-apiservice-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.060132 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-webhook-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.060178 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqpw\" (UniqueName: \"kubernetes.io/projected/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-kube-api-access-fwqpw\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.060214 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-apiservice-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.070249 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-apiservice-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.070412 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-webhook-cert\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.098029 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqpw\" (UniqueName: \"kubernetes.io/projected/ab06cf8d-01c1-45c8-9c95-6f3369b8ef75-kube-api-access-fwqpw\") pod \"metallb-operator-webhook-server-c67dfbd86-j4s2n\" (UID: \"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75\") " pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.128857 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj"] Sep 30 13:51:11 crc kubenswrapper[4936]: W0930 13:51:11.133779 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4775401_fba2_4958_b075_6862db490e18.slice/crio-4c7e24816134d61ed220d82c40fdebe680e9c3d83ea408f9bdb0eded8f243ad3 WatchSource:0}: Error finding container 4c7e24816134d61ed220d82c40fdebe680e9c3d83ea408f9bdb0eded8f243ad3: Status 404 returned error can't find the container with id 4c7e24816134d61ed220d82c40fdebe680e9c3d83ea408f9bdb0eded8f243ad3 Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 
13:51:11.188028 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.464895 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n"] Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.586821 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" event={"ID":"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75","Type":"ContainerStarted","Data":"5df69f41fc4ba1f8a89a29f8a1446eaafa31d3523d43a63a80fb3486d67d44ac"} Sep 30 13:51:11 crc kubenswrapper[4936]: I0930 13:51:11.587729 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" event={"ID":"d4775401-fba2-4958-b075-6862db490e18","Type":"ContainerStarted","Data":"4c7e24816134d61ed220d82c40fdebe680e9c3d83ea408f9bdb0eded8f243ad3"} Sep 30 13:51:14 crc kubenswrapper[4936]: I0930 13:51:14.605023 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" event={"ID":"d4775401-fba2-4958-b075-6862db490e18","Type":"ContainerStarted","Data":"613b1fd58bee6edfce25b862ce239924f15d7739f8dec2c5ff99cd21b0d63b67"} Sep 30 13:51:14 crc kubenswrapper[4936]: I0930 13:51:14.605654 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:14 crc kubenswrapper[4936]: I0930 13:51:14.638056 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" podStartSLOduration=1.618508278 podStartE2EDuration="4.638037461s" podCreationTimestamp="2025-09-30 13:51:10 +0000 UTC" firstStartedPulling="2025-09-30 13:51:11.136009194 +0000 UTC m=+721.520011495" 
lastFinishedPulling="2025-09-30 13:51:14.155538377 +0000 UTC m=+724.539540678" observedRunningTime="2025-09-30 13:51:14.636554589 +0000 UTC m=+725.020556900" watchObservedRunningTime="2025-09-30 13:51:14.638037461 +0000 UTC m=+725.022039762" Sep 30 13:51:17 crc kubenswrapper[4936]: I0930 13:51:17.621124 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" event={"ID":"ab06cf8d-01c1-45c8-9c95-6f3369b8ef75","Type":"ContainerStarted","Data":"5982e4798b8d49b704e31af5867a8ca924082cff0191e607a292c22754e175ea"} Sep 30 13:51:17 crc kubenswrapper[4936]: I0930 13:51:17.621511 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:17 crc kubenswrapper[4936]: I0930 13:51:17.645695 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" podStartSLOduration=1.9395066650000001 podStartE2EDuration="7.645674976s" podCreationTimestamp="2025-09-30 13:51:10 +0000 UTC" firstStartedPulling="2025-09-30 13:51:11.475086362 +0000 UTC m=+721.859088663" lastFinishedPulling="2025-09-30 13:51:17.181254673 +0000 UTC m=+727.565256974" observedRunningTime="2025-09-30 13:51:17.63738253 +0000 UTC m=+728.021384831" watchObservedRunningTime="2025-09-30 13:51:17.645674976 +0000 UTC m=+728.029677277" Sep 30 13:51:18 crc kubenswrapper[4936]: I0930 13:51:18.250468 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:51:18 crc kubenswrapper[4936]: I0930 13:51:18.250543 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:51:28 crc kubenswrapper[4936]: I0930 13:51:28.810079 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:51:28 crc kubenswrapper[4936]: I0930 13:51:28.810800 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" podUID="836a4387-b928-437f-a758-289ece3ff594" containerName="controller-manager" containerID="cri-o://b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678" gracePeriod=30 Sep 30 13:51:28 crc kubenswrapper[4936]: I0930 13:51:28.837172 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:51:28 crc kubenswrapper[4936]: I0930 13:51:28.837630 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" podUID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" containerName="route-controller-manager" containerID="cri-o://45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522" gracePeriod=30 Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.398231 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.407214 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515170 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca\") pod \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515253 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca\") pod \"836a4387-b928-437f-a758-289ece3ff594\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515308 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert\") pod \"836a4387-b928-437f-a758-289ece3ff594\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515326 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config\") pod \"836a4387-b928-437f-a758-289ece3ff594\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515371 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnrcp\" (UniqueName: \"kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp\") pod \"836a4387-b928-437f-a758-289ece3ff594\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515399 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5w5j7\" (UniqueName: \"kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7\") pod \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515439 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles\") pod \"836a4387-b928-437f-a758-289ece3ff594\" (UID: \"836a4387-b928-437f-a758-289ece3ff594\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515470 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config\") pod \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515487 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert\") pod \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\" (UID: \"b58ad99c-bbed-4e80-9cc1-f281c2072fbf\") " Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.515968 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "b58ad99c-bbed-4e80-9cc1-f281c2072fbf" (UID: "b58ad99c-bbed-4e80-9cc1-f281c2072fbf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.516140 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca" (OuterVolumeSpecName: "client-ca") pod "836a4387-b928-437f-a758-289ece3ff594" (UID: "836a4387-b928-437f-a758-289ece3ff594"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.517087 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "836a4387-b928-437f-a758-289ece3ff594" (UID: "836a4387-b928-437f-a758-289ece3ff594"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.517119 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config" (OuterVolumeSpecName: "config") pod "836a4387-b928-437f-a758-289ece3ff594" (UID: "836a4387-b928-437f-a758-289ece3ff594"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.517197 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config" (OuterVolumeSpecName: "config") pod "b58ad99c-bbed-4e80-9cc1-f281c2072fbf" (UID: "b58ad99c-bbed-4e80-9cc1-f281c2072fbf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.524790 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp" (OuterVolumeSpecName: "kube-api-access-wnrcp") pod "836a4387-b928-437f-a758-289ece3ff594" (UID: "836a4387-b928-437f-a758-289ece3ff594"). InnerVolumeSpecName "kube-api-access-wnrcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.525529 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7" (OuterVolumeSpecName: "kube-api-access-5w5j7") pod "b58ad99c-bbed-4e80-9cc1-f281c2072fbf" (UID: "b58ad99c-bbed-4e80-9cc1-f281c2072fbf"). InnerVolumeSpecName "kube-api-access-5w5j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.525576 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b58ad99c-bbed-4e80-9cc1-f281c2072fbf" (UID: "b58ad99c-bbed-4e80-9cc1-f281c2072fbf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.525582 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "836a4387-b928-437f-a758-289ece3ff594" (UID: "836a4387-b928-437f-a758-289ece3ff594"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616903 4936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616938 4936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616949 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836a4387-b928-437f-a758-289ece3ff594-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616961 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616974 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnrcp\" (UniqueName: \"kubernetes.io/projected/836a4387-b928-437f-a758-289ece3ff594-kube-api-access-wnrcp\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616985 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w5j7\" (UniqueName: \"kubernetes.io/projected/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-kube-api-access-5w5j7\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.616992 4936 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/836a4387-b928-437f-a758-289ece3ff594-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.617001 4936 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.617010 4936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ad99c-bbed-4e80-9cc1-f281c2072fbf-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.682511 4936 generic.go:334] "Generic (PLEG): container finished" podID="836a4387-b928-437f-a758-289ece3ff594" containerID="b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678" exitCode=0 Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.682561 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" event={"ID":"836a4387-b928-437f-a758-289ece3ff594","Type":"ContainerDied","Data":"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678"} Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.682596 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.682611 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-djmjn" event={"ID":"836a4387-b928-437f-a758-289ece3ff594","Type":"ContainerDied","Data":"fdac9b475d4bfa00edf922ecf6613e603d39a8446fe99913f7198d7278067bcc"} Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.682635 4936 scope.go:117] "RemoveContainer" containerID="b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.684811 4936 generic.go:334] "Generic (PLEG): container finished" podID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" containerID="45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522" exitCode=0 Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.684853 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" event={"ID":"b58ad99c-bbed-4e80-9cc1-f281c2072fbf","Type":"ContainerDied","Data":"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522"} Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.684878 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" event={"ID":"b58ad99c-bbed-4e80-9cc1-f281c2072fbf","Type":"ContainerDied","Data":"703f2d9448759efcd75cda5de1fb30a5a14cd0b9689a04ff9298e198d297f077"} Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.684933 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.697614 4936 scope.go:117] "RemoveContainer" containerID="b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678" Sep 30 13:51:29 crc kubenswrapper[4936]: E0930 13:51:29.698150 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678\": container with ID starting with b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678 not found: ID does not exist" containerID="b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.698194 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678"} err="failed to get container status \"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678\": rpc error: code = NotFound desc = could not find container \"b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678\": container with ID starting with b05225a9e8ebc128b4f49bc14bac367033217200e5f2fca853d0b8bfae36e678 not found: ID does not exist" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.698221 4936 scope.go:117] "RemoveContainer" containerID="45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.715718 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.716241 4936 scope.go:117] "RemoveContainer" containerID="45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522" Sep 30 13:51:29 crc kubenswrapper[4936]: E0930 13:51:29.716961 4936 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522\": container with ID starting with 45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522 not found: ID does not exist" containerID="45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.717079 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522"} err="failed to get container status \"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522\": rpc error: code = NotFound desc = could not find container \"45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522\": container with ID starting with 45787c0543b2f23cbabb613351aead176bf92f29aea11f0b96e76e9661dbc522 not found: ID does not exist" Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.720477 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-djmjn"] Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.727977 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:51:29 crc kubenswrapper[4936]: I0930 13:51:29.733200 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gd7hl"] Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.321132 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836a4387-b928-437f-a758-289ece3ff594" path="/var/lib/kubelet/pods/836a4387-b928-437f-a758-289ece3ff594/volumes" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.322722 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" 
path="/var/lib/kubelet/pods/b58ad99c-bbed-4e80-9cc1-f281c2072fbf/volumes" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.530774 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb"] Sep 30 13:51:30 crc kubenswrapper[4936]: E0930 13:51:30.531174 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836a4387-b928-437f-a758-289ece3ff594" containerName="controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.531239 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="836a4387-b928-437f-a758-289ece3ff594" containerName="controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: E0930 13:51:30.531301 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" containerName="route-controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.531370 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" containerName="route-controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.531541 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="836a4387-b928-437f-a758-289ece3ff594" containerName="controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.531609 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58ad99c-bbed-4e80-9cc1-f281c2072fbf" containerName="route-controller-manager" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.532105 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.534811 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.535713 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75fc68db85-5m8jw"] Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.535900 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.536186 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.536583 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.539894 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.543834 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.544106 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.544030 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.547617 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.547802 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.548248 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.548318 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.548392 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.556328 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb"] Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 
13:51:30.559214 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.560105 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fc68db85-5m8jw"] Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628006 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0e36b-bae7-4a81-a430-321127c6e90f-serving-cert\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628072 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrs8h\" (UniqueName: \"kubernetes.io/projected/b987db6d-f40c-4854-bc75-fe3b1c1331f1-kube-api-access-rrs8h\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628115 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-client-ca\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628145 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm5qf\" (UniqueName: \"kubernetes.io/projected/bbf0e36b-bae7-4a81-a430-321127c6e90f-kube-api-access-zm5qf\") pod 
\"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628181 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-proxy-ca-bundles\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628216 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-config\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628249 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-client-ca\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628280 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b987db6d-f40c-4854-bc75-fe3b1c1331f1-serving-cert\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.628312 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-config\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729374 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-config\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729434 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0e36b-bae7-4a81-a430-321127c6e90f-serving-cert\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729459 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrs8h\" (UniqueName: \"kubernetes.io/projected/b987db6d-f40c-4854-bc75-fe3b1c1331f1-kube-api-access-rrs8h\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729485 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-client-ca\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: 
\"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729504 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm5qf\" (UniqueName: \"kubernetes.io/projected/bbf0e36b-bae7-4a81-a430-321127c6e90f-kube-api-access-zm5qf\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729520 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-proxy-ca-bundles\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729539 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-config\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729561 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-client-ca\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.729583 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b987db6d-f40c-4854-bc75-fe3b1c1331f1-serving-cert\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.731734 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-config\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.732008 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-proxy-ca-bundles\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.733162 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b987db6d-f40c-4854-bc75-fe3b1c1331f1-client-ca\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.733443 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-config\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.734051 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0e36b-bae7-4a81-a430-321127c6e90f-client-ca\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.748588 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm5qf\" (UniqueName: \"kubernetes.io/projected/bbf0e36b-bae7-4a81-a430-321127c6e90f-kube-api-access-zm5qf\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.749005 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0e36b-bae7-4a81-a430-321127c6e90f-serving-cert\") pod \"controller-manager-75fc68db85-5m8jw\" (UID: \"bbf0e36b-bae7-4a81-a430-321127c6e90f\") " pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.749406 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b987db6d-f40c-4854-bc75-fe3b1c1331f1-serving-cert\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.753095 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrs8h\" (UniqueName: \"kubernetes.io/projected/b987db6d-f40c-4854-bc75-fe3b1c1331f1-kube-api-access-rrs8h\") pod \"route-controller-manager-6cdb7488fd-kvxmb\" (UID: \"b987db6d-f40c-4854-bc75-fe3b1c1331f1\") " 
pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.847532 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:30 crc kubenswrapper[4936]: I0930 13:51:30.863867 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:31 crc kubenswrapper[4936]: I0930 13:51:31.098207 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fc68db85-5m8jw"] Sep 30 13:51:31 crc kubenswrapper[4936]: W0930 13:51:31.128904 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf0e36b_bae7_4a81_a430_321127c6e90f.slice/crio-1bc69ff491d3abaea795bb0e158e4daf212c65f3d4e8ec2a2d83cf7034e00fdd WatchSource:0}: Error finding container 1bc69ff491d3abaea795bb0e158e4daf212c65f3d4e8ec2a2d83cf7034e00fdd: Status 404 returned error can't find the container with id 1bc69ff491d3abaea795bb0e158e4daf212c65f3d4e8ec2a2d83cf7034e00fdd Sep 30 13:51:31 crc kubenswrapper[4936]: I0930 13:51:31.187257 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb"] Sep 30 13:51:31 crc kubenswrapper[4936]: I0930 13:51:31.208094 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c67dfbd86-j4s2n" Sep 30 13:51:31 crc kubenswrapper[4936]: I0930 13:51:31.700279 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" 
event={"ID":"b987db6d-f40c-4854-bc75-fe3b1c1331f1","Type":"ContainerStarted","Data":"d4f07129575c18261908ae39a4a268db4bc4b4eece488b158450f3f787716b26"} Sep 30 13:51:31 crc kubenswrapper[4936]: I0930 13:51:31.702385 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" event={"ID":"bbf0e36b-bae7-4a81-a430-321127c6e90f","Type":"ContainerStarted","Data":"1bc69ff491d3abaea795bb0e158e4daf212c65f3d4e8ec2a2d83cf7034e00fdd"} Sep 30 13:51:32 crc kubenswrapper[4936]: I0930 13:51:32.709019 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" event={"ID":"b987db6d-f40c-4854-bc75-fe3b1c1331f1","Type":"ContainerStarted","Data":"c191826cb2a14c0947d34db12f549c4c7d333d525640aa88944b20e59d0c8be6"} Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.756157 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" event={"ID":"bbf0e36b-bae7-4a81-a430-321127c6e90f","Type":"ContainerStarted","Data":"61993cf12db4ca0cfde9ee08e8bf483714c5eefcb96b4a4eab54db84b514027c"} Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.756201 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.756212 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.763516 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.764495 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.817551 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cdb7488fd-kvxmb" podStartSLOduration=4.817414795 podStartE2EDuration="4.817414795s" podCreationTimestamp="2025-09-30 13:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:51:33.790768039 +0000 UTC m=+744.174770340" watchObservedRunningTime="2025-09-30 13:51:33.817414795 +0000 UTC m=+744.201417096" Sep 30 13:51:33 crc kubenswrapper[4936]: I0930 13:51:33.846883 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75fc68db85-5m8jw" podStartSLOduration=4.84686761 podStartE2EDuration="4.84686761s" podCreationTimestamp="2025-09-30 13:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:51:33.8207653 +0000 UTC m=+744.204767601" watchObservedRunningTime="2025-09-30 13:51:33.84686761 +0000 UTC m=+744.230869911" Sep 30 13:51:39 crc kubenswrapper[4936]: I0930 13:51:39.407960 4936 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.458505 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.460381 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.475954 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.545407 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.545503 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69qg\" (UniqueName: \"kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.545575 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.646777 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.646834 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.646883 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69qg\" (UniqueName: \"kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.647310 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.647386 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.667216 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69qg\" (UniqueName: \"kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg\") pod \"community-operators-cq22t\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:47 crc kubenswrapper[4936]: I0930 13:51:47.783412 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.250191 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.250573 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.250618 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.252279 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.252351 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1" gracePeriod=600 Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.377874 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:51:48 crc kubenswrapper[4936]: E0930 13:51:48.776220 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1510484a_671c_47aa_bd53_d27a7eebc27c.slice/crio-b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1510484a_671c_47aa_bd53_d27a7eebc27c.slice/crio-conmon-b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.845555 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1" exitCode=0 Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.845625 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1"} Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.845657 4936 scope.go:117] "RemoveContainer" containerID="7a663dbb554115c6a5c0d9c45457fde22c2bc879b1b264710b17f1c647448671" Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.847251 4936 generic.go:334] "Generic (PLEG): container finished" podID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerID="b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361" exitCode=0 Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.847327 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" 
event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerDied","Data":"b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361"} Sep 30 13:51:48 crc kubenswrapper[4936]: I0930 13:51:48.847456 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerStarted","Data":"5a97a57cb1c52915dc325f899b5c6a4c8ed8e6308bebeede02af46cfbb39897c"} Sep 30 13:51:49 crc kubenswrapper[4936]: I0930 13:51:49.855992 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4"} Sep 30 13:51:50 crc kubenswrapper[4936]: I0930 13:51:50.710986 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6fd76b6558-4gwpj" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.466919 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-j947b"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.469955 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.474052 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.474273 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-p2csh" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.478964 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.502803 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.503562 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.506731 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.551677 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.590009 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-24z6s"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.590855 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.593036 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jrnzw" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.594189 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.594626 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597141 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snfzg\" (UniqueName: \"kubernetes.io/projected/f304af2d-f6f0-4be9-8388-a81870af995f-kube-api-access-snfzg\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597310 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597434 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbmtn\" (UniqueName: \"kubernetes.io/projected/55e7d0de-bf31-4644-958e-33d7fe7c696b-kube-api-access-jbmtn\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597513 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-reloader\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597591 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-sockets\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597687 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f304af2d-f6f0-4be9-8388-a81870af995f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597765 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-conf\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597913 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-startup\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.597960 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics-certs\") pod 
\"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.598105 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.625620 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-pkkdt"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.626694 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.628956 4936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.640975 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pkkdt"] Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.698992 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699024 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbmtn\" (UniqueName: \"kubernetes.io/projected/55e7d0de-bf31-4644-958e-33d7fe7c696b-kube-api-access-jbmtn\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699051 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-reloader\") pod \"frr-k8s-j947b\" (UID: 
\"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699071 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-sockets\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699104 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwjz6\" (UniqueName: \"kubernetes.io/projected/63b433c1-ca17-4e41-9412-8c9abede7b39-kube-api-access-dwjz6\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699128 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-cert\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699158 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f304af2d-f6f0-4be9-8388-a81870af995f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699177 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-conf\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 
30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699199 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7f4\" (UniqueName: \"kubernetes.io/projected/f673a383-44a6-4fe9-a432-f84341817e89-kube-api-access-rx7f4\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699213 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699234 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-startup\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699250 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699268 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics-certs\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699285 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-metrics-certs\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699308 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snfzg\" (UniqueName: \"kubernetes.io/projected/f304af2d-f6f0-4be9-8388-a81870af995f-kube-api-access-snfzg\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699328 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f673a383-44a6-4fe9-a432-f84341817e89-metallb-excludel2\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699470 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699623 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-reloader\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.699742 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-sockets\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.700053 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-conf\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.700313 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/55e7d0de-bf31-4644-958e-33d7fe7c696b-frr-startup\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.711408 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f304af2d-f6f0-4be9-8388-a81870af995f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.719733 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55e7d0de-bf31-4644-958e-33d7fe7c696b-metrics-certs\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.722596 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snfzg\" (UniqueName: \"kubernetes.io/projected/f304af2d-f6f0-4be9-8388-a81870af995f-kube-api-access-snfzg\") pod \"frr-k8s-webhook-server-5478bdb765-chnlb\" (UID: \"f304af2d-f6f0-4be9-8388-a81870af995f\") " 
pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.723412 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbmtn\" (UniqueName: \"kubernetes.io/projected/55e7d0de-bf31-4644-958e-33d7fe7c696b-kube-api-access-jbmtn\") pod \"frr-k8s-j947b\" (UID: \"55e7d0de-bf31-4644-958e-33d7fe7c696b\") " pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.787209 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j947b" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800267 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-cert\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800371 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7f4\" (UniqueName: \"kubernetes.io/projected/f673a383-44a6-4fe9-a432-f84341817e89-kube-api-access-rx7f4\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800397 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800433 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs\") pod \"speaker-24z6s\" (UID: 
\"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800462 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-metrics-certs\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800498 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f673a383-44a6-4fe9-a432-f84341817e89-metallb-excludel2\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.800535 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwjz6\" (UniqueName: \"kubernetes.io/projected/63b433c1-ca17-4e41-9412-8c9abede7b39-kube-api-access-dwjz6\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: E0930 13:51:51.801634 4936 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 30 13:51:51 crc kubenswrapper[4936]: E0930 13:51:51.801698 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs podName:f673a383-44a6-4fe9-a432-f84341817e89 nodeName:}" failed. No retries permitted until 2025-09-30 13:51:52.301677952 +0000 UTC m=+762.685680253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs") pod "speaker-24z6s" (UID: "f673a383-44a6-4fe9-a432-f84341817e89") : secret "speaker-certs-secret" not found Sep 30 13:51:51 crc kubenswrapper[4936]: E0930 13:51:51.801991 4936 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 13:51:51 crc kubenswrapper[4936]: E0930 13:51:51.802022 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist podName:f673a383-44a6-4fe9-a432-f84341817e89 nodeName:}" failed. No retries permitted until 2025-09-30 13:51:52.302011671 +0000 UTC m=+762.686013972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist") pod "speaker-24z6s" (UID: "f673a383-44a6-4fe9-a432-f84341817e89") : secret "metallb-memberlist" not found Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.802904 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f673a383-44a6-4fe9-a432-f84341817e89-metallb-excludel2\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.805994 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-metrics-certs\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.807737 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/63b433c1-ca17-4e41-9412-8c9abede7b39-cert\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.817091 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.822878 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7f4\" (UniqueName: \"kubernetes.io/projected/f673a383-44a6-4fe9-a432-f84341817e89-kube-api-access-rx7f4\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.834134 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwjz6\" (UniqueName: \"kubernetes.io/projected/63b433c1-ca17-4e41-9412-8c9abede7b39-kube-api-access-dwjz6\") pod \"controller-5d688f5ffc-pkkdt\" (UID: \"63b433c1-ca17-4e41-9412-8c9abede7b39\") " pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.880010 4936 generic.go:334] "Generic (PLEG): container finished" podID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerID="f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369" exitCode=0 Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.880052 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerDied","Data":"f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369"} Sep 30 13:51:51 crc kubenswrapper[4936]: I0930 13:51:51.941153 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.290850 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb"] Sep 30 13:51:52 crc kubenswrapper[4936]: W0930 13:51:52.296096 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf304af2d_f6f0_4be9_8388_a81870af995f.slice/crio-156fe30341336c31544c2f85a198e8e616d20e8001cab6a4cee30cd751c4aa2a WatchSource:0}: Error finding container 156fe30341336c31544c2f85a198e8e616d20e8001cab6a4cee30cd751c4aa2a: Status 404 returned error can't find the container with id 156fe30341336c31544c2f85a198e8e616d20e8001cab6a4cee30cd751c4aa2a Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.309579 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.309623 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:52 crc kubenswrapper[4936]: E0930 13:51:52.310167 4936 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 13:51:52 crc kubenswrapper[4936]: E0930 13:51:52.310224 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist podName:f673a383-44a6-4fe9-a432-f84341817e89 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:51:53.310205645 +0000 UTC m=+763.694207946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist") pod "speaker-24z6s" (UID: "f673a383-44a6-4fe9-a432-f84341817e89") : secret "metallb-memberlist" not found Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.315890 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-metrics-certs\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.432981 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-pkkdt"] Sep 30 13:51:52 crc kubenswrapper[4936]: W0930 13:51:52.433728 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b433c1_ca17_4e41_9412_8c9abede7b39.slice/crio-589169dbbc2bd2666261df2b3b7207a6797ad8e06cdd3e22232a8f839d8ab071 WatchSource:0}: Error finding container 589169dbbc2bd2666261df2b3b7207a6797ad8e06cdd3e22232a8f839d8ab071: Status 404 returned error can't find the container with id 589169dbbc2bd2666261df2b3b7207a6797ad8e06cdd3e22232a8f839d8ab071 Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.885286 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"d2ad8c0259721d317428f3cb08672b3468972c6b9857424c00a8303036b2bfee"} Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.886571 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pkkdt" 
event={"ID":"63b433c1-ca17-4e41-9412-8c9abede7b39","Type":"ContainerStarted","Data":"121a71ba58d89885796b89eca1f6a1e4ed5dc947823a9dfc6887cbe6f089b098"} Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.886629 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pkkdt" event={"ID":"63b433c1-ca17-4e41-9412-8c9abede7b39","Type":"ContainerStarted","Data":"589169dbbc2bd2666261df2b3b7207a6797ad8e06cdd3e22232a8f839d8ab071"} Sep 30 13:51:52 crc kubenswrapper[4936]: I0930 13:51:52.887402 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" event={"ID":"f304af2d-f6f0-4be9-8388-a81870af995f","Type":"ContainerStarted","Data":"156fe30341336c31544c2f85a198e8e616d20e8001cab6a4cee30cd751c4aa2a"} Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.329128 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.337904 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f673a383-44a6-4fe9-a432-f84341817e89-memberlist\") pod \"speaker-24z6s\" (UID: \"f673a383-44a6-4fe9-a432-f84341817e89\") " pod="metallb-system/speaker-24z6s" Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.403941 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-24z6s" Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.903902 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerStarted","Data":"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221"} Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.906283 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-24z6s" event={"ID":"f673a383-44a6-4fe9-a432-f84341817e89","Type":"ContainerStarted","Data":"e4d95174c4726c8ffe9c841c7eafd3732145119bcda15910253763e403765c7b"} Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.906309 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-24z6s" event={"ID":"f673a383-44a6-4fe9-a432-f84341817e89","Type":"ContainerStarted","Data":"131f4e09015e0facee751ea6b5418c929f7608c7a0e2b2221b1953e024cd1d8a"} Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.915650 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-pkkdt" event={"ID":"63b433c1-ca17-4e41-9412-8c9abede7b39","Type":"ContainerStarted","Data":"026b5da42724975a85f689f4ca7dc601039f505fafa9bf0fe0d0f403d9d54e81"} Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.916268 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:51:53 crc kubenswrapper[4936]: I0930 13:51:53.948749 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cq22t" podStartSLOduration=2.851359026 podStartE2EDuration="6.948730477s" podCreationTimestamp="2025-09-30 13:51:47 +0000 UTC" firstStartedPulling="2025-09-30 13:51:48.848997437 +0000 UTC m=+759.232999738" lastFinishedPulling="2025-09-30 13:51:52.946368888 +0000 UTC m=+763.330371189" observedRunningTime="2025-09-30 
13:51:53.943731965 +0000 UTC m=+764.327734276" watchObservedRunningTime="2025-09-30 13:51:53.948730477 +0000 UTC m=+764.332732778" Sep 30 13:51:54 crc kubenswrapper[4936]: I0930 13:51:54.940735 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-24z6s" event={"ID":"f673a383-44a6-4fe9-a432-f84341817e89","Type":"ContainerStarted","Data":"bf12c55062afce8a3ba071bbcf8f88bd55551490e49103d61f77050404344f7b"} Sep 30 13:51:54 crc kubenswrapper[4936]: I0930 13:51:54.988916 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-24z6s" podStartSLOduration=3.988901429 podStartE2EDuration="3.988901429s" podCreationTimestamp="2025-09-30 13:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:51:54.987926921 +0000 UTC m=+765.371929222" watchObservedRunningTime="2025-09-30 13:51:54.988901429 +0000 UTC m=+765.372903730" Sep 30 13:51:54 crc kubenswrapper[4936]: I0930 13:51:54.990324 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-pkkdt" podStartSLOduration=3.990318079 podStartE2EDuration="3.990318079s" podCreationTimestamp="2025-09-30 13:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:51:54.009433979 +0000 UTC m=+764.393436290" watchObservedRunningTime="2025-09-30 13:51:54.990318079 +0000 UTC m=+765.374320380" Sep 30 13:51:55 crc kubenswrapper[4936]: I0930 13:51:55.944915 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-24z6s" Sep 30 13:51:57 crc kubenswrapper[4936]: I0930 13:51:57.784029 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:57 crc kubenswrapper[4936]: I0930 13:51:57.784294 4936 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:57 crc kubenswrapper[4936]: I0930 13:51:57.819609 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:58 crc kubenswrapper[4936]: I0930 13:51:58.998219 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:51:59 crc kubenswrapper[4936]: I0930 13:51:59.037751 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:52:00 crc kubenswrapper[4936]: I0930 13:52:00.974447 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cq22t" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="registry-server" containerID="cri-o://905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221" gracePeriod=2 Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.492760 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.547609 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content\") pod \"1510484a-671c-47aa-bd53-d27a7eebc27c\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.547662 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities\") pod \"1510484a-671c-47aa-bd53-d27a7eebc27c\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.547727 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r69qg\" (UniqueName: \"kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg\") pod \"1510484a-671c-47aa-bd53-d27a7eebc27c\" (UID: \"1510484a-671c-47aa-bd53-d27a7eebc27c\") " Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.549376 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities" (OuterVolumeSpecName: "utilities") pod "1510484a-671c-47aa-bd53-d27a7eebc27c" (UID: "1510484a-671c-47aa-bd53-d27a7eebc27c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.555076 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg" (OuterVolumeSpecName: "kube-api-access-r69qg") pod "1510484a-671c-47aa-bd53-d27a7eebc27c" (UID: "1510484a-671c-47aa-bd53-d27a7eebc27c"). InnerVolumeSpecName "kube-api-access-r69qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.596764 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1510484a-671c-47aa-bd53-d27a7eebc27c" (UID: "1510484a-671c-47aa-bd53-d27a7eebc27c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.649303 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.649353 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1510484a-671c-47aa-bd53-d27a7eebc27c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.649366 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r69qg\" (UniqueName: \"kubernetes.io/projected/1510484a-671c-47aa-bd53-d27a7eebc27c-kube-api-access-r69qg\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.981168 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" event={"ID":"f304af2d-f6f0-4be9-8388-a81870af995f","Type":"ContainerStarted","Data":"7b7bed6190fe7466b223319e13ba13bfe201177b6379347d010fea8e85d4c7e8"} Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.981345 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.982725 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="55e7d0de-bf31-4644-958e-33d7fe7c696b" containerID="4208431fbc6866f6dba4321d767414c7834405125372907f73acb0b7b4863cdd" exitCode=0 Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.983659 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerDied","Data":"4208431fbc6866f6dba4321d767414c7834405125372907f73acb0b7b4863cdd"} Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.986011 4936 generic.go:334] "Generic (PLEG): container finished" podID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerID="905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221" exitCode=0 Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.986039 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerDied","Data":"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221"} Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.986056 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cq22t" event={"ID":"1510484a-671c-47aa-bd53-d27a7eebc27c","Type":"ContainerDied","Data":"5a97a57cb1c52915dc325f899b5c6a4c8ed8e6308bebeede02af46cfbb39897c"} Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.986071 4936 scope.go:117] "RemoveContainer" containerID="905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.986166 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cq22t" Sep 30 13:52:01 crc kubenswrapper[4936]: I0930 13:52:01.998206 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" podStartSLOduration=1.561075431 podStartE2EDuration="10.99819085s" podCreationTimestamp="2025-09-30 13:51:51 +0000 UTC" firstStartedPulling="2025-09-30 13:51:52.298457872 +0000 UTC m=+762.682460173" lastFinishedPulling="2025-09-30 13:52:01.735573291 +0000 UTC m=+772.119575592" observedRunningTime="2025-09-30 13:52:01.995848263 +0000 UTC m=+772.379850564" watchObservedRunningTime="2025-09-30 13:52:01.99819085 +0000 UTC m=+772.382193151" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.011327 4936 scope.go:117] "RemoveContainer" containerID="f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.040532 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.049751 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cq22t"] Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.059320 4936 scope.go:117] "RemoveContainer" containerID="b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.074790 4936 scope.go:117] "RemoveContainer" containerID="905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221" Sep 30 13:52:02 crc kubenswrapper[4936]: E0930 13:52:02.075172 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221\": container with ID starting with 905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221 not found: ID does not exist" 
containerID="905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.075220 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221"} err="failed to get container status \"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221\": rpc error: code = NotFound desc = could not find container \"905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221\": container with ID starting with 905c245fa5da32f0b519c613ebb77b927d27f0bda29410183190343c946cc221 not found: ID does not exist" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.075251 4936 scope.go:117] "RemoveContainer" containerID="f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369" Sep 30 13:52:02 crc kubenswrapper[4936]: E0930 13:52:02.075773 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369\": container with ID starting with f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369 not found: ID does not exist" containerID="f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.075806 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369"} err="failed to get container status \"f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369\": rpc error: code = NotFound desc = could not find container \"f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369\": container with ID starting with f8fbffc64f80ec9c8d3322ce2c29342eb484ebe6fb2bbecc638b21d31903e369 not found: ID does not exist" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.075828 4936 scope.go:117] 
"RemoveContainer" containerID="b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361" Sep 30 13:52:02 crc kubenswrapper[4936]: E0930 13:52:02.076469 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361\": container with ID starting with b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361 not found: ID does not exist" containerID="b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.076527 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361"} err="failed to get container status \"b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361\": rpc error: code = NotFound desc = could not find container \"b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361\": container with ID starting with b75e224dc89ae3a713e87bad8ff56fd633b573fac9f4bc49db283637eb757361 not found: ID does not exist" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.322327 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" path="/var/lib/kubelet/pods/1510484a-671c-47aa-bd53-d27a7eebc27c/volumes" Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.992248 4936 generic.go:334] "Generic (PLEG): container finished" podID="55e7d0de-bf31-4644-958e-33d7fe7c696b" containerID="f24bd805279257fe7d63651c2bb0a64ba1743adf7938b324996bca001e751f60" exitCode=0 Sep 30 13:52:02 crc kubenswrapper[4936]: I0930 13:52:02.992306 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerDied","Data":"f24bd805279257fe7d63651c2bb0a64ba1743adf7938b324996bca001e751f60"} Sep 30 13:52:03 crc 
kubenswrapper[4936]: I0930 13:52:03.408347 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-24z6s" Sep 30 13:52:04 crc kubenswrapper[4936]: I0930 13:52:04.002423 4936 generic.go:334] "Generic (PLEG): container finished" podID="55e7d0de-bf31-4644-958e-33d7fe7c696b" containerID="aa0fc9aaa7bda4b3263399890c1b65fca0c01ef88ea4782d5b9573a8a5360322" exitCode=0 Sep 30 13:52:04 crc kubenswrapper[4936]: I0930 13:52:04.002489 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerDied","Data":"aa0fc9aaa7bda4b3263399890c1b65fca0c01ef88ea4782d5b9573a8a5360322"} Sep 30 13:52:05 crc kubenswrapper[4936]: I0930 13:52:05.049308 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"24acf27d07f594c464bd70a2bee5d0930856c75cada54fde010ef6a62699c620"} Sep 30 13:52:05 crc kubenswrapper[4936]: I0930 13:52:05.049635 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"3029e14e1ebfbc6b91b2289aaf82e4b2ded0e8ebe1673dbbfa4fbbb4c8467601"} Sep 30 13:52:05 crc kubenswrapper[4936]: I0930 13:52:05.049645 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"1fc9f288c9f28d767740da22569ca456c5644c545d1f7b6680b0690e77220f5b"} Sep 30 13:52:05 crc kubenswrapper[4936]: I0930 13:52:05.049653 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"7046c76c67116b74ac90afd7cdcf8fc75002bb2863ffb14f0b26eee5ae580ce7"} Sep 30 13:52:05 crc kubenswrapper[4936]: I0930 13:52:05.049668 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"9efd0dfe658c31d53e1c3a437d2373903f04820e48833ef7ca55471797684f81"} Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.057400 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j947b" event={"ID":"55e7d0de-bf31-4644-958e-33d7fe7c696b","Type":"ContainerStarted","Data":"c81eb9668d269915108a487bfe5aa5a0e8ff9b12310c42c45f392ca52022c089"} Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.058389 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-j947b" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.341658 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-j947b" podStartSLOduration=5.625139588 podStartE2EDuration="15.341633311s" podCreationTimestamp="2025-09-30 13:51:51 +0000 UTC" firstStartedPulling="2025-09-30 13:51:52.048829382 +0000 UTC m=+762.432831683" lastFinishedPulling="2025-09-30 13:52:01.765323105 +0000 UTC m=+772.149325406" observedRunningTime="2025-09-30 13:52:06.083640143 +0000 UTC m=+776.467642444" watchObservedRunningTime="2025-09-30 13:52:06.341633311 +0000 UTC m=+776.725635612" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.344843 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:06 crc kubenswrapper[4936]: E0930 13:52:06.345212 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="extract-content" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.345310 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="extract-content" Sep 30 13:52:06 crc kubenswrapper[4936]: E0930 13:52:06.345403 4936 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="registry-server" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.345460 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="registry-server" Sep 30 13:52:06 crc kubenswrapper[4936]: E0930 13:52:06.345531 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="extract-utilities" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.345608 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="extract-utilities" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.345802 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="1510484a-671c-47aa-bd53-d27a7eebc27c" containerName="registry-server" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.346369 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.348989 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.349187 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cw9k9" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.355374 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.361583 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.407556 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmr8\" (UniqueName: \"kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8\") pod \"openstack-operator-index-6nl8s\" (UID: \"1b776e9f-dda7-4637-b414-c3a52784dfc0\") " pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.508791 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsmr8\" (UniqueName: \"kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8\") pod \"openstack-operator-index-6nl8s\" (UID: \"1b776e9f-dda7-4637-b414-c3a52784dfc0\") " pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.526671 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsmr8\" (UniqueName: \"kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8\") pod \"openstack-operator-index-6nl8s\" (UID: 
\"1b776e9f-dda7-4637-b414-c3a52784dfc0\") " pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.665040 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.787869 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-j947b" Sep 30 13:52:06 crc kubenswrapper[4936]: I0930 13:52:06.833508 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-j947b" Sep 30 13:52:07 crc kubenswrapper[4936]: I0930 13:52:07.062460 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:07 crc kubenswrapper[4936]: W0930 13:52:07.086467 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b776e9f_dda7_4637_b414_c3a52784dfc0.slice/crio-5355edd7a10afb25134cc168736b7cc78b7110e91005818ee7f59ccab93e43d8 WatchSource:0}: Error finding container 5355edd7a10afb25134cc168736b7cc78b7110e91005818ee7f59ccab93e43d8: Status 404 returned error can't find the container with id 5355edd7a10afb25134cc168736b7cc78b7110e91005818ee7f59ccab93e43d8 Sep 30 13:52:08 crc kubenswrapper[4936]: I0930 13:52:08.069804 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6nl8s" event={"ID":"1b776e9f-dda7-4637-b414-c3a52784dfc0","Type":"ContainerStarted","Data":"5355edd7a10afb25134cc168736b7cc78b7110e91005818ee7f59ccab93e43d8"} Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.122442 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.736082 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-b5lzr"] Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.737051 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.738851 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b5lzr"] Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.854599 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fz2w\" (UniqueName: \"kubernetes.io/projected/6fc87ca3-ce0e-4976-b45f-cf28709a6f9f-kube-api-access-9fz2w\") pod \"openstack-operator-index-b5lzr\" (UID: \"6fc87ca3-ce0e-4976-b45f-cf28709a6f9f\") " pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.956243 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fz2w\" (UniqueName: \"kubernetes.io/projected/6fc87ca3-ce0e-4976-b45f-cf28709a6f9f-kube-api-access-9fz2w\") pod \"openstack-operator-index-b5lzr\" (UID: \"6fc87ca3-ce0e-4976-b45f-cf28709a6f9f\") " pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:09 crc kubenswrapper[4936]: I0930 13:52:09.992969 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fz2w\" (UniqueName: \"kubernetes.io/projected/6fc87ca3-ce0e-4976-b45f-cf28709a6f9f-kube-api-access-9fz2w\") pod \"openstack-operator-index-b5lzr\" (UID: \"6fc87ca3-ce0e-4976-b45f-cf28709a6f9f\") " pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:10 crc kubenswrapper[4936]: I0930 13:52:10.061361 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:10 crc kubenswrapper[4936]: I0930 13:52:10.763882 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b5lzr"] Sep 30 13:52:11 crc kubenswrapper[4936]: I0930 13:52:11.092198 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5lzr" event={"ID":"6fc87ca3-ce0e-4976-b45f-cf28709a6f9f","Type":"ContainerStarted","Data":"a82e21108f0e63b7b72b1901ad02c346d823615fcfa5074b0f5a4e40e0860ebf"} Sep 30 13:52:11 crc kubenswrapper[4936]: I0930 13:52:11.823461 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-chnlb" Sep 30 13:52:11 crc kubenswrapper[4936]: I0930 13:52:11.950873 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-pkkdt" Sep 30 13:52:12 crc kubenswrapper[4936]: I0930 13:52:12.100351 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b5lzr" event={"ID":"6fc87ca3-ce0e-4976-b45f-cf28709a6f9f","Type":"ContainerStarted","Data":"9d223b70bb7e3877b157b9cda113f21e9bd3b4de6e8eea313132f3bdb4457edb"} Sep 30 13:52:12 crc kubenswrapper[4936]: I0930 13:52:12.102591 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6nl8s" event={"ID":"1b776e9f-dda7-4637-b414-c3a52784dfc0","Type":"ContainerStarted","Data":"05d297a7a67eacf493a7711bd9f238e1ff31693ab2a39b6f6b662bd2935127e3"} Sep 30 13:52:12 crc kubenswrapper[4936]: I0930 13:52:12.102662 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6nl8s" podUID="1b776e9f-dda7-4637-b414-c3a52784dfc0" containerName="registry-server" containerID="cri-o://05d297a7a67eacf493a7711bd9f238e1ff31693ab2a39b6f6b662bd2935127e3" gracePeriod=2 
Sep 30 13:52:12 crc kubenswrapper[4936]: I0930 13:52:12.116725 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b5lzr" podStartSLOduration=2.1447925310000002 podStartE2EDuration="3.116709016s" podCreationTimestamp="2025-09-30 13:52:09 +0000 UTC" firstStartedPulling="2025-09-30 13:52:10.776632808 +0000 UTC m=+781.160635109" lastFinishedPulling="2025-09-30 13:52:11.748549293 +0000 UTC m=+782.132551594" observedRunningTime="2025-09-30 13:52:12.113973038 +0000 UTC m=+782.497975339" watchObservedRunningTime="2025-09-30 13:52:12.116709016 +0000 UTC m=+782.500711317" Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.108647 4936 generic.go:334] "Generic (PLEG): container finished" podID="1b776e9f-dda7-4637-b414-c3a52784dfc0" containerID="05d297a7a67eacf493a7711bd9f238e1ff31693ab2a39b6f6b662bd2935127e3" exitCode=0 Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.108722 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6nl8s" event={"ID":"1b776e9f-dda7-4637-b414-c3a52784dfc0","Type":"ContainerDied","Data":"05d297a7a67eacf493a7711bd9f238e1ff31693ab2a39b6f6b662bd2935127e3"} Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.765661 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.802373 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsmr8\" (UniqueName: \"kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8\") pod \"1b776e9f-dda7-4637-b414-c3a52784dfc0\" (UID: \"1b776e9f-dda7-4637-b414-c3a52784dfc0\") " Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.807510 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8" (OuterVolumeSpecName: "kube-api-access-xsmr8") pod "1b776e9f-dda7-4637-b414-c3a52784dfc0" (UID: "1b776e9f-dda7-4637-b414-c3a52784dfc0"). InnerVolumeSpecName "kube-api-access-xsmr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:52:13 crc kubenswrapper[4936]: I0930 13:52:13.903201 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsmr8\" (UniqueName: \"kubernetes.io/projected/1b776e9f-dda7-4637-b414-c3a52784dfc0-kube-api-access-xsmr8\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.117115 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6nl8s" event={"ID":"1b776e9f-dda7-4637-b414-c3a52784dfc0","Type":"ContainerDied","Data":"5355edd7a10afb25134cc168736b7cc78b7110e91005818ee7f59ccab93e43d8"} Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.117174 4936 scope.go:117] "RemoveContainer" containerID="05d297a7a67eacf493a7711bd9f238e1ff31693ab2a39b6f6b662bd2935127e3" Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.118423 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6nl8s" Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.149290 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.152806 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6nl8s"] Sep 30 13:52:14 crc kubenswrapper[4936]: I0930 13:52:14.324231 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b776e9f-dda7-4637-b414-c3a52784dfc0" path="/var/lib/kubelet/pods/1b776e9f-dda7-4637-b414-c3a52784dfc0/volumes" Sep 30 13:52:20 crc kubenswrapper[4936]: I0930 13:52:20.062584 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:20 crc kubenswrapper[4936]: I0930 13:52:20.063072 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:20 crc kubenswrapper[4936]: I0930 13:52:20.086566 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:20 crc kubenswrapper[4936]: I0930 13:52:20.179620 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-b5lzr" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.762860 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2"] Sep 30 13:52:21 crc kubenswrapper[4936]: E0930 13:52:21.763397 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b776e9f-dda7-4637-b414-c3a52784dfc0" containerName="registry-server" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.763408 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b776e9f-dda7-4637-b414-c3a52784dfc0" containerName="registry-server" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.763514 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b776e9f-dda7-4637-b414-c3a52784dfc0" containerName="registry-server" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.764345 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.766665 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x2p72" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.774083 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2"] Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.791196 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-j947b" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.805256 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.805626 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrwj\" (UniqueName: \"kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " 
pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.805771 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.907948 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.908382 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.908538 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrwj\" (UniqueName: \"kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 
13:52:21.910038 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.910391 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:21 crc kubenswrapper[4936]: I0930 13:52:21.934275 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrwj\" (UniqueName: \"kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj\") pod \"6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:22 crc kubenswrapper[4936]: I0930 13:52:22.081410 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:22 crc kubenswrapper[4936]: I0930 13:52:22.472627 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2"] Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.131841 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.133541 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.144577 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.181481 4936 generic.go:334] "Generic (PLEG): container finished" podID="286303d8-20ec-45c4-86cf-3da1af48f329" containerID="cde5ef3d5cca9584816fd3282e6e81953a23dff10e6a28ee35fa33bac9cc5518" exitCode=0 Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.181524 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" event={"ID":"286303d8-20ec-45c4-86cf-3da1af48f329","Type":"ContainerDied","Data":"cde5ef3d5cca9584816fd3282e6e81953a23dff10e6a28ee35fa33bac9cc5518"} Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.181547 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" event={"ID":"286303d8-20ec-45c4-86cf-3da1af48f329","Type":"ContainerStarted","Data":"3ec2e9da079773946047a2a50fa9b94a0fe26234c68faee257792fac275ed703"} Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.226889 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.226957 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.226998 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2kxk\" (UniqueName: \"kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.328418 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.328481 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.328511 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x2kxk\" (UniqueName: \"kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.328953 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.328958 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.349822 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2kxk\" (UniqueName: \"kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk\") pod \"redhat-operators-kxkfm\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.447122 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:23 crc kubenswrapper[4936]: I0930 13:52:23.721130 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:24 crc kubenswrapper[4936]: I0930 13:52:24.188730 4936 generic.go:334] "Generic (PLEG): container finished" podID="286303d8-20ec-45c4-86cf-3da1af48f329" containerID="19fd477be745825b28c9e4b073bd053cb6c2525a7743c628b44e7408a4df695b" exitCode=0 Sep 30 13:52:24 crc kubenswrapper[4936]: I0930 13:52:24.188809 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" event={"ID":"286303d8-20ec-45c4-86cf-3da1af48f329","Type":"ContainerDied","Data":"19fd477be745825b28c9e4b073bd053cb6c2525a7743c628b44e7408a4df695b"} Sep 30 13:52:24 crc kubenswrapper[4936]: I0930 13:52:24.191574 4936 generic.go:334] "Generic (PLEG): container finished" podID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerID="76660f3b9aac95ac0d655f8d0894e7afbbccd1bde5b9c7ddfac032c4080a77b2" exitCode=0 Sep 30 13:52:24 crc kubenswrapper[4936]: I0930 13:52:24.191614 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerDied","Data":"76660f3b9aac95ac0d655f8d0894e7afbbccd1bde5b9c7ddfac032c4080a77b2"} Sep 30 13:52:24 crc kubenswrapper[4936]: I0930 13:52:24.191641 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerStarted","Data":"bf2b42fd2378bd9bc4edfa6cbc71947512799c879017818b19c4b3af950f773e"} Sep 30 13:52:25 crc kubenswrapper[4936]: I0930 13:52:25.200106 4936 generic.go:334] "Generic (PLEG): container finished" podID="286303d8-20ec-45c4-86cf-3da1af48f329" containerID="afaa68dd67da44975c90e6172a82d773f2aab58e440969e5e549bcda7f787448" 
exitCode=0 Sep 30 13:52:25 crc kubenswrapper[4936]: I0930 13:52:25.200161 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" event={"ID":"286303d8-20ec-45c4-86cf-3da1af48f329","Type":"ContainerDied","Data":"afaa68dd67da44975c90e6172a82d773f2aab58e440969e5e549bcda7f787448"} Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.206218 4936 generic.go:334] "Generic (PLEG): container finished" podID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerID="4d06d5f7df8d4801dda336595a8bc976719e29d5a1f52d6bca589c76581b6826" exitCode=0 Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.206348 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerDied","Data":"4d06d5f7df8d4801dda336595a8bc976719e29d5a1f52d6bca589c76581b6826"} Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.464068 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.568291 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle\") pod \"286303d8-20ec-45c4-86cf-3da1af48f329\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.568414 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlrwj\" (UniqueName: \"kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj\") pod \"286303d8-20ec-45c4-86cf-3da1af48f329\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.568528 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util\") pod \"286303d8-20ec-45c4-86cf-3da1af48f329\" (UID: \"286303d8-20ec-45c4-86cf-3da1af48f329\") " Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.569162 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle" (OuterVolumeSpecName: "bundle") pod "286303d8-20ec-45c4-86cf-3da1af48f329" (UID: "286303d8-20ec-45c4-86cf-3da1af48f329"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.573625 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj" (OuterVolumeSpecName: "kube-api-access-nlrwj") pod "286303d8-20ec-45c4-86cf-3da1af48f329" (UID: "286303d8-20ec-45c4-86cf-3da1af48f329"). InnerVolumeSpecName "kube-api-access-nlrwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.586876 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util" (OuterVolumeSpecName: "util") pod "286303d8-20ec-45c4-86cf-3da1af48f329" (UID: "286303d8-20ec-45c4-86cf-3da1af48f329"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.669688 4936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-util\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.669720 4936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/286303d8-20ec-45c4-86cf-3da1af48f329-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:26 crc kubenswrapper[4936]: I0930 13:52:26.669731 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlrwj\" (UniqueName: \"kubernetes.io/projected/286303d8-20ec-45c4-86cf-3da1af48f329-kube-api-access-nlrwj\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:27 crc kubenswrapper[4936]: I0930 13:52:27.214230 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerStarted","Data":"b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce"} Sep 30 13:52:27 crc kubenswrapper[4936]: I0930 13:52:27.217029 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" event={"ID":"286303d8-20ec-45c4-86cf-3da1af48f329","Type":"ContainerDied","Data":"3ec2e9da079773946047a2a50fa9b94a0fe26234c68faee257792fac275ed703"} Sep 30 13:52:27 crc kubenswrapper[4936]: I0930 13:52:27.217064 4936 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ec2e9da079773946047a2a50fa9b94a0fe26234c68faee257792fac275ed703" Sep 30 13:52:27 crc kubenswrapper[4936]: I0930 13:52:27.217090 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2" Sep 30 13:52:27 crc kubenswrapper[4936]: I0930 13:52:27.232665 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxkfm" podStartSLOduration=1.7712110779999999 podStartE2EDuration="4.23265071s" podCreationTimestamp="2025-09-30 13:52:23 +0000 UTC" firstStartedPulling="2025-09-30 13:52:24.193976146 +0000 UTC m=+794.577978437" lastFinishedPulling="2025-09-30 13:52:26.655415768 +0000 UTC m=+797.039418069" observedRunningTime="2025-09-30 13:52:27.22982884 +0000 UTC m=+797.613831151" watchObservedRunningTime="2025-09-30 13:52:27.23265071 +0000 UTC m=+797.616653011" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.132470 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:52:30 crc kubenswrapper[4936]: E0930 13:52:30.133082 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="pull" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.133097 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="pull" Sep 30 13:52:30 crc kubenswrapper[4936]: E0930 13:52:30.133110 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="extract" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.133118 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="extract" Sep 30 13:52:30 crc kubenswrapper[4936]: E0930 
13:52:30.133136 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="util" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.133145 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="util" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.133305 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="286303d8-20ec-45c4-86cf-3da1af48f329" containerName="extract" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.135121 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.161197 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.213939 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.213996 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.214054 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmr8n\" (UniqueName: \"kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n\") pod 
\"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.226442 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz"] Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.227489 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.238169 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-tqddl" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.267908 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz"] Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.316778 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmr8n\" (UniqueName: \"kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.316869 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.316888 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.316927 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhw6q\" (UniqueName: \"kubernetes.io/projected/4ea53f04-2776-4e45-9444-6255d7fd2860-kube-api-access-xhw6q\") pod \"openstack-operator-controller-operator-69769bbb6-9mvrz\" (UID: \"4ea53f04-2776-4e45-9444-6255d7fd2860\") " pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.317660 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.317885 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.363678 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmr8n\" (UniqueName: \"kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n\") pod \"redhat-marketplace-cxlzj\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.418689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xhw6q\" (UniqueName: \"kubernetes.io/projected/4ea53f04-2776-4e45-9444-6255d7fd2860-kube-api-access-xhw6q\") pod \"openstack-operator-controller-operator-69769bbb6-9mvrz\" (UID: \"4ea53f04-2776-4e45-9444-6255d7fd2860\") " pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.450164 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhw6q\" (UniqueName: \"kubernetes.io/projected/4ea53f04-2776-4e45-9444-6255d7fd2860-kube-api-access-xhw6q\") pod \"openstack-operator-controller-operator-69769bbb6-9mvrz\" (UID: \"4ea53f04-2776-4e45-9444-6255d7fd2860\") " pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.465611 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:52:30 crc kubenswrapper[4936]: I0930 13:52:30.552148 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:31 crc kubenswrapper[4936]: I0930 13:52:31.169256 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz"] Sep 30 13:52:31 crc kubenswrapper[4936]: W0930 13:52:31.174303 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea53f04_2776_4e45_9444_6255d7fd2860.slice/crio-37ec8e7e5a3e5ec9f3bba9ccd87f561d1d6d270a6a5b696b59d8da494c40ba5c WatchSource:0}: Error finding container 37ec8e7e5a3e5ec9f3bba9ccd87f561d1d6d270a6a5b696b59d8da494c40ba5c: Status 404 returned error can't find the container with id 37ec8e7e5a3e5ec9f3bba9ccd87f561d1d6d270a6a5b696b59d8da494c40ba5c Sep 30 13:52:31 crc kubenswrapper[4936]: I0930 13:52:31.228785 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:52:31 crc kubenswrapper[4936]: I0930 13:52:31.256422 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" event={"ID":"4ea53f04-2776-4e45-9444-6255d7fd2860","Type":"ContainerStarted","Data":"37ec8e7e5a3e5ec9f3bba9ccd87f561d1d6d270a6a5b696b59d8da494c40ba5c"} Sep 30 13:52:31 crc kubenswrapper[4936]: I0930 13:52:31.257286 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerStarted","Data":"c3a3673c794ad369d72bc0eb1be7c699359665e4ae2c2063fc76fc827d2e8d3b"} Sep 30 13:52:32 crc kubenswrapper[4936]: I0930 13:52:32.272792 4936 generic.go:334] "Generic (PLEG): container finished" podID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerID="6023d84b785e2948b81acfb8b77279935e513cd2c09ae4de8126a77399d957a0" exitCode=0 Sep 30 13:52:32 crc kubenswrapper[4936]: I0930 
13:52:32.272871 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerDied","Data":"6023d84b785e2948b81acfb8b77279935e513cd2c09ae4de8126a77399d957a0"} Sep 30 13:52:33 crc kubenswrapper[4936]: I0930 13:52:33.447814 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:33 crc kubenswrapper[4936]: I0930 13:52:33.448203 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:33 crc kubenswrapper[4936]: I0930 13:52:33.527156 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:34 crc kubenswrapper[4936]: I0930 13:52:34.331986 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:37 crc kubenswrapper[4936]: I0930 13:52:37.122216 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:37 crc kubenswrapper[4936]: I0930 13:52:37.122738 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxkfm" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="registry-server" containerID="cri-o://b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce" gracePeriod=2 Sep 30 13:52:39 crc kubenswrapper[4936]: E0930 13:52:39.449182 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb878dd97_f0d3_457d_8451_7518cfcc6fa6.slice/crio-conmon-b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:52:40 crc 
kubenswrapper[4936]: I0930 13:52:40.322314 4936 generic.go:334] "Generic (PLEG): container finished" podID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerID="b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce" exitCode=0 Sep 30 13:52:40 crc kubenswrapper[4936]: I0930 13:52:40.322476 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerDied","Data":"b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce"} Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.366991 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.409922 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2kxk\" (UniqueName: \"kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk\") pod \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.410011 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content\") pod \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.410100 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities\") pod \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\" (UID: \"b878dd97-f0d3-457d-8451-7518cfcc6fa6\") " Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.411856 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities" (OuterVolumeSpecName: "utilities") pod "b878dd97-f0d3-457d-8451-7518cfcc6fa6" (UID: "b878dd97-f0d3-457d-8451-7518cfcc6fa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.415887 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk" (OuterVolumeSpecName: "kube-api-access-x2kxk") pod "b878dd97-f0d3-457d-8451-7518cfcc6fa6" (UID: "b878dd97-f0d3-457d-8451-7518cfcc6fa6"). InnerVolumeSpecName "kube-api-access-x2kxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.488423 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b878dd97-f0d3-457d-8451-7518cfcc6fa6" (UID: "b878dd97-f0d3-457d-8451-7518cfcc6fa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.511674 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.511713 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2kxk\" (UniqueName: \"kubernetes.io/projected/b878dd97-f0d3-457d-8451-7518cfcc6fa6-kube-api-access-x2kxk\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:41 crc kubenswrapper[4936]: I0930 13:52:41.511735 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b878dd97-f0d3-457d-8451-7518cfcc6fa6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.343073 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxkfm" event={"ID":"b878dd97-f0d3-457d-8451-7518cfcc6fa6","Type":"ContainerDied","Data":"bf2b42fd2378bd9bc4edfa6cbc71947512799c879017818b19c4b3af950f773e"} Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.343484 4936 scope.go:117] "RemoveContainer" containerID="b39b8343da7a5be57e1ac883e90fd0c8e770c901caff759c8ad4c25fd1c619ce" Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.343142 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxkfm" Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.367820 4936 scope.go:117] "RemoveContainer" containerID="4d06d5f7df8d4801dda336595a8bc976719e29d5a1f52d6bca589c76581b6826" Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.371169 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.377044 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxkfm"] Sep 30 13:52:42 crc kubenswrapper[4936]: I0930 13:52:42.386618 4936 scope.go:117] "RemoveContainer" containerID="76660f3b9aac95ac0d655f8d0894e7afbbccd1bde5b9c7ddfac032c4080a77b2" Sep 30 13:52:44 crc kubenswrapper[4936]: I0930 13:52:44.325865 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" path="/var/lib/kubelet/pods/b878dd97-f0d3-457d-8451-7518cfcc6fa6/volumes" Sep 30 13:52:48 crc kubenswrapper[4936]: I0930 13:52:48.376927 4936 generic.go:334] "Generic (PLEG): container finished" podID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerID="d9573bc07fdddbb06a8490b98d820d64b4376b66b9a6ae37b1afae70e5ac05a0" exitCode=0 Sep 30 13:52:48 crc kubenswrapper[4936]: I0930 13:52:48.377125 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerDied","Data":"d9573bc07fdddbb06a8490b98d820d64b4376b66b9a6ae37b1afae70e5ac05a0"} Sep 30 13:52:48 crc kubenswrapper[4936]: I0930 13:52:48.379145 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" event={"ID":"4ea53f04-2776-4e45-9444-6255d7fd2860","Type":"ContainerStarted","Data":"c800f820d76cfadd74e9e2b8cc992ab4fd46786893491477454b8928ef2c5058"} Sep 30 13:52:51 crc kubenswrapper[4936]: I0930 
13:52:51.400541 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerStarted","Data":"02d6f3f837e3c793b193d841f599afd5f625ba582404c68e475245af9da2f560"} Sep 30 13:52:51 crc kubenswrapper[4936]: I0930 13:52:51.416050 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxlzj" podStartSLOduration=3.097855731 podStartE2EDuration="21.41603219s" podCreationTimestamp="2025-09-30 13:52:30 +0000 UTC" firstStartedPulling="2025-09-30 13:52:32.275125127 +0000 UTC m=+802.659127428" lastFinishedPulling="2025-09-30 13:52:50.593301586 +0000 UTC m=+820.977303887" observedRunningTime="2025-09-30 13:52:51.414617889 +0000 UTC m=+821.798620210" watchObservedRunningTime="2025-09-30 13:52:51.41603219 +0000 UTC m=+821.800034491" Sep 30 13:52:57 crc kubenswrapper[4936]: I0930 13:52:57.444371 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" event={"ID":"4ea53f04-2776-4e45-9444-6255d7fd2860","Type":"ContainerStarted","Data":"9e53d65abdd1092c15adbd2f5869f72c68d6dbbe4c6bade52dae07338f49c028"} Sep 30 13:52:57 crc kubenswrapper[4936]: I0930 13:52:57.444846 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:57 crc kubenswrapper[4936]: I0930 13:52:57.446672 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" Sep 30 13:52:57 crc kubenswrapper[4936]: I0930 13:52:57.479178 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-69769bbb6-9mvrz" podStartSLOduration=2.083908033 podStartE2EDuration="27.479163405s" 
podCreationTimestamp="2025-09-30 13:52:30 +0000 UTC" firstStartedPulling="2025-09-30 13:52:31.176635061 +0000 UTC m=+801.560637352" lastFinishedPulling="2025-09-30 13:52:56.571890423 +0000 UTC m=+826.955892724" observedRunningTime="2025-09-30 13:52:57.477809367 +0000 UTC m=+827.861811688" watchObservedRunningTime="2025-09-30 13:52:57.479163405 +0000 UTC m=+827.863165706" Sep 30 13:53:00 crc kubenswrapper[4936]: I0930 13:53:00.466099 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:00 crc kubenswrapper[4936]: I0930 13:53:00.466734 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:00 crc kubenswrapper[4936]: I0930 13:53:00.502761 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:01 crc kubenswrapper[4936]: I0930 13:53:01.507928 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:01 crc kubenswrapper[4936]: I0930 13:53:01.547694 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:53:03 crc kubenswrapper[4936]: I0930 13:53:03.474451 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxlzj" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="registry-server" containerID="cri-o://02d6f3f837e3c793b193d841f599afd5f625ba582404c68e475245af9da2f560" gracePeriod=2 Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.494986 4936 generic.go:334] "Generic (PLEG): container finished" podID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerID="02d6f3f837e3c793b193d841f599afd5f625ba582404c68e475245af9da2f560" exitCode=0 Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.495035 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerDied","Data":"02d6f3f837e3c793b193d841f599afd5f625ba582404c68e475245af9da2f560"} Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.706292 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.731457 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content\") pod \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.731553 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities\") pod \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.731605 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmr8n\" (UniqueName: \"kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n\") pod \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\" (UID: \"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a\") " Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.732388 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities" (OuterVolumeSpecName: "utilities") pod "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" (UID: "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.741769 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n" (OuterVolumeSpecName: "kube-api-access-zmr8n") pod "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" (UID: "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a"). InnerVolumeSpecName "kube-api-access-zmr8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.749822 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" (UID: "e580eb8e-d1f7-49b6-8633-1cb097bd1d7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.833177 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.833210 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:04 crc kubenswrapper[4936]: I0930 13:53:04.833223 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmr8n\" (UniqueName: \"kubernetes.io/projected/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a-kube-api-access-zmr8n\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.502923 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxlzj" 
event={"ID":"e580eb8e-d1f7-49b6-8633-1cb097bd1d7a","Type":"ContainerDied","Data":"c3a3673c794ad369d72bc0eb1be7c699359665e4ae2c2063fc76fc827d2e8d3b"} Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.502961 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxlzj" Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.502987 4936 scope.go:117] "RemoveContainer" containerID="02d6f3f837e3c793b193d841f599afd5f625ba582404c68e475245af9da2f560" Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.527125 4936 scope.go:117] "RemoveContainer" containerID="d9573bc07fdddbb06a8490b98d820d64b4376b66b9a6ae37b1afae70e5ac05a0" Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.538524 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.544741 4936 scope.go:117] "RemoveContainer" containerID="6023d84b785e2948b81acfb8b77279935e513cd2c09ae4de8126a77399d957a0" Sep 30 13:53:05 crc kubenswrapper[4936]: I0930 13:53:05.562352 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxlzj"] Sep 30 13:53:06 crc kubenswrapper[4936]: I0930 13:53:06.324185 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" path="/var/lib/kubelet/pods/e580eb8e-d1f7-49b6-8633-1cb097bd1d7a/volumes" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.126267 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128025 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="extract-utilities" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128094 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="extract-utilities" Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128274 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128328 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128418 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="extract-content" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128469 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="extract-content" Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128518 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="extract-content" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128573 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="extract-content" Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128641 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="extract-utilities" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128695 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="extract-utilities" Sep 30 13:53:11 crc kubenswrapper[4936]: E0930 13:53:11.128749 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128797 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.128965 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580eb8e-d1f7-49b6-8633-1cb097bd1d7a" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.129022 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b878dd97-f0d3-457d-8451-7518cfcc6fa6" containerName="registry-server" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.129923 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.134672 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.247233 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8z4s\" (UniqueName: \"kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.247278 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.247320 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content\") pod \"certified-operators-fwh6j\" (UID: 
\"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.349027 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8z4s\" (UniqueName: \"kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.349311 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.350101 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.349413 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.350213 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") 
" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.378502 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8z4s\" (UniqueName: \"kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s\") pod \"certified-operators-fwh6j\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.444107 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:11 crc kubenswrapper[4936]: I0930 13:53:11.841098 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:11 crc kubenswrapper[4936]: W0930 13:53:11.848735 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2866263_7d1e_47c3_8f8f_25bb43505e5e.slice/crio-95b9c79ae1551c262581571be1a8ec596992214019c645f36c9a8e514eab3850 WatchSource:0}: Error finding container 95b9c79ae1551c262581571be1a8ec596992214019c645f36c9a8e514eab3850: Status 404 returned error can't find the container with id 95b9c79ae1551c262581571be1a8ec596992214019c645f36c9a8e514eab3850 Sep 30 13:53:12 crc kubenswrapper[4936]: I0930 13:53:12.541859 4936 generic.go:334] "Generic (PLEG): container finished" podID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerID="b51d821947355c45a982a5cfd0f06c341f0dbb600f2fdca3c963c983f65cfc2c" exitCode=0 Sep 30 13:53:12 crc kubenswrapper[4936]: I0930 13:53:12.542135 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerDied","Data":"b51d821947355c45a982a5cfd0f06c341f0dbb600f2fdca3c963c983f65cfc2c"} Sep 30 13:53:12 crc kubenswrapper[4936]: I0930 
13:53:12.542159 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerStarted","Data":"95b9c79ae1551c262581571be1a8ec596992214019c645f36c9a8e514eab3850"} Sep 30 13:53:13 crc kubenswrapper[4936]: I0930 13:53:13.548951 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerStarted","Data":"a2fba98cf427b008158fcc51b77e456c6b30ed1d08794f4d00319d3d0b7704e6"} Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.544684 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.545867 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.548446 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5k994" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.552809 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.554067 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.557300 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7nrtx" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.557628 4936 generic.go:334] "Generic (PLEG): container finished" podID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerID="a2fba98cf427b008158fcc51b77e456c6b30ed1d08794f4d00319d3d0b7704e6" exitCode=0 Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.557655 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerDied","Data":"a2fba98cf427b008158fcc51b77e456c6b30ed1d08794f4d00319d3d0b7704e6"} Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.568789 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.579438 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.600446 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.604127 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.609425 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zdj\" (UniqueName: \"kubernetes.io/projected/650ff8e9-279f-41ff-8bb8-1880e7cf985c-kube-api-access-v2zdj\") pod \"barbican-operator-controller-manager-6ff8b75857-x5g5r\" (UID: \"650ff8e9-279f-41ff-8bb8-1880e7cf985c\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.609484 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpf5f\" (UniqueName: \"kubernetes.io/projected/7cc732ee-78f8-4d20-aac8-67ab10b944d3-kube-api-access-cpf5f\") pod \"cinder-operator-controller-manager-644bddb6d8-6c8cz\" (UID: \"7cc732ee-78f8-4d20-aac8-67ab10b944d3\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.610603 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-d9bsz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.647567 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.661029 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.661159 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.668989 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rtb7v" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.688473 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.695643 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.696927 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.708816 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f88nn" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.711487 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt57\" (UniqueName: \"kubernetes.io/projected/f632d83e-2c2c-4c90-8fea-5747d58633d6-kube-api-access-vpt57\") pod \"heat-operator-controller-manager-5d889d78cf-g5k7h\" (UID: \"f632d83e-2c2c-4c90-8fea-5747d58633d6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.711536 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9kl\" (UniqueName: \"kubernetes.io/projected/bd9be0ef-9048-4e0a-b8d7-1b29b450984f-kube-api-access-bc9kl\") pod \"glance-operator-controller-manager-84958c4d49-fk2zk\" (UID: \"bd9be0ef-9048-4e0a-b8d7-1b29b450984f\") " 
pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.711564 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zdj\" (UniqueName: \"kubernetes.io/projected/650ff8e9-279f-41ff-8bb8-1880e7cf985c-kube-api-access-v2zdj\") pod \"barbican-operator-controller-manager-6ff8b75857-x5g5r\" (UID: \"650ff8e9-279f-41ff-8bb8-1880e7cf985c\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.711599 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xjw\" (UniqueName: \"kubernetes.io/projected/9d8425ad-dcdc-4d31-9a5c-9461adb3296c-kube-api-access-v6xjw\") pod \"designate-operator-controller-manager-84f4f7b77b-xz9zg\" (UID: \"9d8425ad-dcdc-4d31-9a5c-9461adb3296c\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.711620 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpf5f\" (UniqueName: \"kubernetes.io/projected/7cc732ee-78f8-4d20-aac8-67ab10b944d3-kube-api-access-cpf5f\") pod \"cinder-operator-controller-manager-644bddb6d8-6c8cz\" (UID: \"7cc732ee-78f8-4d20-aac8-67ab10b944d3\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.719493 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.735476 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.736462 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.738513 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.739314 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.745138 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vd57p" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.745286 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.745455 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-77sv6" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.754848 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zdj\" (UniqueName: \"kubernetes.io/projected/650ff8e9-279f-41ff-8bb8-1880e7cf985c-kube-api-access-v2zdj\") pod \"barbican-operator-controller-manager-6ff8b75857-x5g5r\" (UID: \"650ff8e9-279f-41ff-8bb8-1880e7cf985c\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.779558 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpf5f\" (UniqueName: \"kubernetes.io/projected/7cc732ee-78f8-4d20-aac8-67ab10b944d3-kube-api-access-cpf5f\") pod \"cinder-operator-controller-manager-644bddb6d8-6c8cz\" (UID: \"7cc732ee-78f8-4d20-aac8-67ab10b944d3\") " 
pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.781225 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.782852 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.785705 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hwzc6" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812596 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzm2v\" (UniqueName: \"kubernetes.io/projected/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-kube-api-access-rzm2v\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812650 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4w8h\" (UniqueName: \"kubernetes.io/projected/189d95a0-9ee0-4055-86c2-724082d46a11-kube-api-access-q4w8h\") pod \"ironic-operator-controller-manager-7975b88857-n28ht\" (UID: \"189d95a0-9ee0-4055-86c2-724082d46a11\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812669 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: 
\"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812697 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt57\" (UniqueName: \"kubernetes.io/projected/f632d83e-2c2c-4c90-8fea-5747d58633d6-kube-api-access-vpt57\") pod \"heat-operator-controller-manager-5d889d78cf-g5k7h\" (UID: \"f632d83e-2c2c-4c90-8fea-5747d58633d6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812721 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klckm\" (UniqueName: \"kubernetes.io/projected/13bd8563-ccb1-4445-b613-495e801195a4-kube-api-access-klckm\") pod \"horizon-operator-controller-manager-9f4696d94-9jjnj\" (UID: \"13bd8563-ccb1-4445-b613-495e801195a4\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812719 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812740 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc9kl\" (UniqueName: \"kubernetes.io/projected/bd9be0ef-9048-4e0a-b8d7-1b29b450984f-kube-api-access-bc9kl\") pod \"glance-operator-controller-manager-84958c4d49-fk2zk\" (UID: \"bd9be0ef-9048-4e0a-b8d7-1b29b450984f\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.812914 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xjw\" (UniqueName: \"kubernetes.io/projected/9d8425ad-dcdc-4d31-9a5c-9461adb3296c-kube-api-access-v6xjw\") pod 
\"designate-operator-controller-manager-84f4f7b77b-xz9zg\" (UID: \"9d8425ad-dcdc-4d31-9a5c-9461adb3296c\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.833761 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.844075 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.848488 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.849699 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.862252 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9kl\" (UniqueName: \"kubernetes.io/projected/bd9be0ef-9048-4e0a-b8d7-1b29b450984f-kube-api-access-bc9kl\") pod \"glance-operator-controller-manager-84958c4d49-fk2zk\" (UID: \"bd9be0ef-9048-4e0a-b8d7-1b29b450984f\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.862479 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5dsgw" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.864906 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.865894 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.869953 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.871869 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt57\" (UniqueName: \"kubernetes.io/projected/f632d83e-2c2c-4c90-8fea-5747d58633d6-kube-api-access-vpt57\") pod \"heat-operator-controller-manager-5d889d78cf-g5k7h\" (UID: \"f632d83e-2c2c-4c90-8fea-5747d58633d6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.872130 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jjvmb" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.890242 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xjw\" (UniqueName: \"kubernetes.io/projected/9d8425ad-dcdc-4d31-9a5c-9461adb3296c-kube-api-access-v6xjw\") pod \"designate-operator-controller-manager-84f4f7b77b-xz9zg\" (UID: \"9d8425ad-dcdc-4d31-9a5c-9461adb3296c\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.897055 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.914907 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klckm\" (UniqueName: \"kubernetes.io/projected/13bd8563-ccb1-4445-b613-495e801195a4-kube-api-access-klckm\") pod \"horizon-operator-controller-manager-9f4696d94-9jjnj\" (UID: \"13bd8563-ccb1-4445-b613-495e801195a4\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.914964 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt7xr\" (UniqueName: \"kubernetes.io/projected/c54f70b2-5767-4616-878b-5816861d2637-kube-api-access-qt7xr\") pod \"keystone-operator-controller-manager-5bd55b4bff-hgdlx\" (UID: \"c54f70b2-5767-4616-878b-5816861d2637\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.915026 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjc6\" (UniqueName: \"kubernetes.io/projected/d08bae3c-64f1-46de-ab2c-d6b2407c2d95-kube-api-access-vbjc6\") pod \"manila-operator-controller-manager-6d68dbc695-sxftv\" (UID: \"d08bae3c-64f1-46de-ab2c-d6b2407c2d95\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.915060 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzm2v\" (UniqueName: \"kubernetes.io/projected/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-kube-api-access-rzm2v\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc 
kubenswrapper[4936]: I0930 13:53:14.915083 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4w8h\" (UniqueName: \"kubernetes.io/projected/189d95a0-9ee0-4055-86c2-724082d46a11-kube-api-access-q4w8h\") pod \"ironic-operator-controller-manager-7975b88857-n28ht\" (UID: \"189d95a0-9ee0-4055-86c2-724082d46a11\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.915097 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc kubenswrapper[4936]: E0930 13:53:14.915208 4936 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 13:53:14 crc kubenswrapper[4936]: E0930 13:53:14.915249 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert podName:7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:15.415233768 +0000 UTC m=+845.799236069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert") pod "infra-operator-controller-manager-7d857cc749-hwpcz" (UID: "7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6") : secret "infra-operator-webhook-server-cert" not found Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.931028 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq"] Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.932005 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.938834 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6xx9c" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.945159 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.965927 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzm2v\" (UniqueName: \"kubernetes.io/projected/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-kube-api-access-rzm2v\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.978691 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klckm\" (UniqueName: \"kubernetes.io/projected/13bd8563-ccb1-4445-b613-495e801195a4-kube-api-access-klckm\") pod \"horizon-operator-controller-manager-9f4696d94-9jjnj\" (UID: \"13bd8563-ccb1-4445-b613-495e801195a4\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:14 crc kubenswrapper[4936]: I0930 13:53:14.996176 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.021275 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt7xr\" (UniqueName: \"kubernetes.io/projected/c54f70b2-5767-4616-878b-5816861d2637-kube-api-access-qt7xr\") pod \"keystone-operator-controller-manager-5bd55b4bff-hgdlx\" (UID: \"c54f70b2-5767-4616-878b-5816861d2637\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.021371 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf9np\" (UniqueName: \"kubernetes.io/projected/847b4871-2d23-4790-b32a-b42698008fee-kube-api-access-mf9np\") pod \"mariadb-operator-controller-manager-88c7-8tpqq\" (UID: \"847b4871-2d23-4790-b32a-b42698008fee\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.021530 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjc6\" (UniqueName: \"kubernetes.io/projected/d08bae3c-64f1-46de-ab2c-d6b2407c2d95-kube-api-access-vbjc6\") pod \"manila-operator-controller-manager-6d68dbc695-sxftv\" (UID: \"d08bae3c-64f1-46de-ab2c-d6b2407c2d95\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.022219 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.030989 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.041690 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4w8h\" (UniqueName: \"kubernetes.io/projected/189d95a0-9ee0-4055-86c2-724082d46a11-kube-api-access-q4w8h\") pod \"ironic-operator-controller-manager-7975b88857-n28ht\" (UID: \"189d95a0-9ee0-4055-86c2-724082d46a11\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.048165 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.049740 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.053295 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.053468 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.053954 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7b7kr" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.060545 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjc6\" (UniqueName: \"kubernetes.io/projected/d08bae3c-64f1-46de-ab2c-d6b2407c2d95-kube-api-access-vbjc6\") pod \"manila-operator-controller-manager-6d68dbc695-sxftv\" (UID: \"d08bae3c-64f1-46de-ab2c-d6b2407c2d95\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.063041 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt7xr\" (UniqueName: \"kubernetes.io/projected/c54f70b2-5767-4616-878b-5816861d2637-kube-api-access-qt7xr\") pod \"keystone-operator-controller-manager-5bd55b4bff-hgdlx\" (UID: \"c54f70b2-5767-4616-878b-5816861d2637\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.067546 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hgx4h" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.083460 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.094461 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.125056 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj"] 
Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.132948 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bw84\" (UniqueName: \"kubernetes.io/projected/e546e1bb-9ee4-4549-9521-76d122b4edf5-kube-api-access-8bw84\") pod \"neutron-operator-controller-manager-64d7b59854-2kldb\" (UID: \"e546e1bb-9ee4-4549-9521-76d122b4edf5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.133012 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt49z\" (UniqueName: \"kubernetes.io/projected/5ec29297-f0db-497d-aa05-e939e9aef380-kube-api-access-rt49z\") pod \"nova-operator-controller-manager-c7c776c96-zvzfj\" (UID: \"5ec29297-f0db-497d-aa05-e939e9aef380\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.133149 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf9np\" (UniqueName: \"kubernetes.io/projected/847b4871-2d23-4790-b32a-b42698008fee-kube-api-access-mf9np\") pod \"mariadb-operator-controller-manager-88c7-8tpqq\" (UID: \"847b4871-2d23-4790-b32a-b42698008fee\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.133651 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.134032 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.150765 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.171509 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.172563 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.186155 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tz5d4" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.190795 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf9np\" (UniqueName: \"kubernetes.io/projected/847b4871-2d23-4790-b32a-b42698008fee-kube-api-access-mf9np\") pod \"mariadb-operator-controller-manager-88c7-8tpqq\" (UID: \"847b4871-2d23-4790-b32a-b42698008fee\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.194656 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.234960 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bw84\" (UniqueName: \"kubernetes.io/projected/e546e1bb-9ee4-4549-9521-76d122b4edf5-kube-api-access-8bw84\") pod \"neutron-operator-controller-manager-64d7b59854-2kldb\" (UID: \"e546e1bb-9ee4-4549-9521-76d122b4edf5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.235022 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt49z\" 
(UniqueName: \"kubernetes.io/projected/5ec29297-f0db-497d-aa05-e939e9aef380-kube-api-access-rt49z\") pod \"nova-operator-controller-manager-c7c776c96-zvzfj\" (UID: \"5ec29297-f0db-497d-aa05-e939e9aef380\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.235060 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgcr\" (UniqueName: \"kubernetes.io/projected/d50d2534-deec-4173-a73a-d10b3beac452-kube-api-access-8xgcr\") pod \"octavia-operator-controller-manager-76fcc6dc7c-lhbld\" (UID: \"d50d2534-deec-4173-a73a-d10b3beac452\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.240833 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.242317 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.248171 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mbtzw" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.261078 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.263409 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.269288 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.272685 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hlcnv" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.284040 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.291778 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.301481 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt49z\" (UniqueName: \"kubernetes.io/projected/5ec29297-f0db-497d-aa05-e939e9aef380-kube-api-access-rt49z\") pod \"nova-operator-controller-manager-c7c776c96-zvzfj\" (UID: \"5ec29297-f0db-497d-aa05-e939e9aef380\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.336422 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.337971 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.342814 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5qglc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.346990 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.348354 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgcr\" (UniqueName: \"kubernetes.io/projected/d50d2534-deec-4173-a73a-d10b3beac452-kube-api-access-8xgcr\") pod \"octavia-operator-controller-manager-76fcc6dc7c-lhbld\" (UID: \"d50d2534-deec-4173-a73a-d10b3beac452\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.348570 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xf4\" (UniqueName: \"kubernetes.io/projected/85448d25-86e9-4a2f-bc5a-339ab3d2112a-kube-api-access-n2xf4\") pod \"ovn-operator-controller-manager-9976ff44c-sthwj\" (UID: \"85448d25-86e9-4a2f-bc5a-339ab3d2112a\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.348712 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.348803 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2hcb\" (UniqueName: \"kubernetes.io/projected/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-kube-api-access-l2hcb\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.352235 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.353515 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.371077 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.371927 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-md6n2" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.385713 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bw84\" (UniqueName: \"kubernetes.io/projected/e546e1bb-9ee4-4549-9521-76d122b4edf5-kube-api-access-8bw84\") pod \"neutron-operator-controller-manager-64d7b59854-2kldb\" (UID: \"e546e1bb-9ee4-4549-9521-76d122b4edf5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.399149 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgcr\" (UniqueName: \"kubernetes.io/projected/d50d2534-deec-4173-a73a-d10b3beac452-kube-api-access-8xgcr\") pod 
\"octavia-operator-controller-manager-76fcc6dc7c-lhbld\" (UID: \"d50d2534-deec-4173-a73a-d10b3beac452\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.399845 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.408999 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.429778 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.444261 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.451236 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.451324 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xf4\" (UniqueName: \"kubernetes.io/projected/85448d25-86e9-4a2f-bc5a-339ab3d2112a-kube-api-access-n2xf4\") pod \"ovn-operator-controller-manager-9976ff44c-sthwj\" (UID: \"85448d25-86e9-4a2f-bc5a-339ab3d2112a\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.451384 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.451403 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2hcb\" (UniqueName: \"kubernetes.io/projected/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-kube-api-access-l2hcb\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: E0930 13:53:15.452307 4936 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 13:53:15 crc kubenswrapper[4936]: E0930 13:53:15.452368 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert podName:7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:16.45235402 +0000 UTC m=+846.836356321 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert") pod "infra-operator-controller-manager-7d857cc749-hwpcz" (UID: "7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6") : secret "infra-operator-webhook-server-cert" not found Sep 30 13:53:15 crc kubenswrapper[4936]: E0930 13:53:15.452839 4936 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:53:15 crc kubenswrapper[4936]: E0930 13:53:15.452866 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert podName:b1538e13-4b0e-4bb9-9277-3d0475cd41a4 nodeName:}" failed. No retries permitted until 2025-09-30 13:53:15.952858144 +0000 UTC m=+846.336860445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-5mdhl" (UID: "b1538e13-4b0e-4bb9-9277-3d0475cd41a4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.460397 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.482369 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.483466 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.496756 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-p85vw" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.498529 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xf4\" (UniqueName: \"kubernetes.io/projected/85448d25-86e9-4a2f-bc5a-339ab3d2112a-kube-api-access-n2xf4\") pod \"ovn-operator-controller-manager-9976ff44c-sthwj\" (UID: \"85448d25-86e9-4a2f-bc5a-339ab3d2112a\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.512162 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2hcb\" (UniqueName: \"kubernetes.io/projected/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-kube-api-access-l2hcb\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.516721 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.531808 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-x2csq"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.532899 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.551084 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4csrl" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.552155 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7xxv\" (UniqueName: \"kubernetes.io/projected/699f5243-7ea5-4f7f-a537-51a99a871ccb-kube-api-access-s7xxv\") pod \"placement-operator-controller-manager-589c58c6c-sxfzz\" (UID: \"699f5243-7ea5-4f7f-a537-51a99a871ccb\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.552204 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzmz\" (UniqueName: \"kubernetes.io/projected/bbbf4ed1-241b-4c4e-80e0-77acd778b868-kube-api-access-7fzmz\") pod \"swift-operator-controller-manager-bc7dc7bd9-n6mv9\" (UID: \"bbbf4ed1-241b-4c4e-80e0-77acd778b868\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.566839 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.574111 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.583143 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-x2csq"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.603762 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.612470 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.637098 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.654018 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7xxv\" (UniqueName: \"kubernetes.io/projected/699f5243-7ea5-4f7f-a537-51a99a871ccb-kube-api-access-s7xxv\") pod \"placement-operator-controller-manager-589c58c6c-sxfzz\" (UID: \"699f5243-7ea5-4f7f-a537-51a99a871ccb\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.654057 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxj2\" (UniqueName: \"kubernetes.io/projected/78f55939-d8fc-40d5-bc8e-a3f87b962b34-kube-api-access-7mxj2\") pod \"telemetry-operator-controller-manager-b8d54b5d7-7rfmc\" (UID: \"78f55939-d8fc-40d5-bc8e-a3f87b962b34\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.654103 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7fzmz\" (UniqueName: \"kubernetes.io/projected/bbbf4ed1-241b-4c4e-80e0-77acd778b868-kube-api-access-7fzmz\") pod \"swift-operator-controller-manager-bc7dc7bd9-n6mv9\" (UID: \"bbbf4ed1-241b-4c4e-80e0-77acd778b868\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.654159 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fhp4\" (UniqueName: \"kubernetes.io/projected/673847ae-740d-4a3b-ad7e-09ec8848199d-kube-api-access-5fhp4\") pod \"test-operator-controller-manager-f66b554c6-x2csq\" (UID: \"673847ae-740d-4a3b-ad7e-09ec8848199d\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.664969 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-sjzqm" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.691476 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzmz\" (UniqueName: \"kubernetes.io/projected/bbbf4ed1-241b-4c4e-80e0-77acd778b868-kube-api-access-7fzmz\") pod \"swift-operator-controller-manager-bc7dc7bd9-n6mv9\" (UID: \"bbbf4ed1-241b-4c4e-80e0-77acd778b868\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.698366 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7xxv\" (UniqueName: \"kubernetes.io/projected/699f5243-7ea5-4f7f-a537-51a99a871ccb-kube-api-access-s7xxv\") pod \"placement-operator-controller-manager-589c58c6c-sxfzz\" (UID: \"699f5243-7ea5-4f7f-a537-51a99a871ccb\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.724767 4936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.757203 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxj2\" (UniqueName: \"kubernetes.io/projected/78f55939-d8fc-40d5-bc8e-a3f87b962b34-kube-api-access-7mxj2\") pod \"telemetry-operator-controller-manager-b8d54b5d7-7rfmc\" (UID: \"78f55939-d8fc-40d5-bc8e-a3f87b962b34\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.757292 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fhp4\" (UniqueName: \"kubernetes.io/projected/673847ae-740d-4a3b-ad7e-09ec8848199d-kube-api-access-5fhp4\") pod \"test-operator-controller-manager-f66b554c6-x2csq\" (UID: \"673847ae-740d-4a3b-ad7e-09ec8848199d\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.757321 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xwt5\" (UniqueName: \"kubernetes.io/projected/639a60da-010a-40d5-bfec-6219ef3f712b-kube-api-access-8xwt5\") pod \"watcher-operator-controller-manager-76669f99c-d2tm5\" (UID: \"639a60da-010a-40d5-bfec-6219ef3f712b\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.767685 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.792580 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fhp4\" (UniqueName: \"kubernetes.io/projected/673847ae-740d-4a3b-ad7e-09ec8848199d-kube-api-access-5fhp4\") pod \"test-operator-controller-manager-f66b554c6-x2csq\" (UID: \"673847ae-740d-4a3b-ad7e-09ec8848199d\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.820269 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxj2\" (UniqueName: \"kubernetes.io/projected/78f55939-d8fc-40d5-bc8e-a3f87b962b34-kube-api-access-7mxj2\") pod \"telemetry-operator-controller-manager-b8d54b5d7-7rfmc\" (UID: \"78f55939-d8fc-40d5-bc8e-a3f87b962b34\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.871619 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.882358 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xwt5\" (UniqueName: \"kubernetes.io/projected/639a60da-010a-40d5-bfec-6219ef3f712b-kube-api-access-8xwt5\") pod \"watcher-operator-controller-manager-76669f99c-d2tm5\" (UID: \"639a60da-010a-40d5-bfec-6219ef3f712b\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.902029 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.903007 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.905503 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.920786 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jfgfk" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.921092 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.952154 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xwt5\" (UniqueName: \"kubernetes.io/projected/639a60da-010a-40d5-bfec-6219ef3f712b-kube-api-access-8xwt5\") pod \"watcher-operator-controller-manager-76669f99c-d2tm5\" (UID: \"639a60da-010a-40d5-bfec-6219ef3f712b\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.953481 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn"] Sep 30 13:53:15 crc kubenswrapper[4936]: I0930 13:53:15.971174 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.018877 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.019252 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.019298 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tjk\" (UniqueName: \"kubernetes.io/projected/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-kube-api-access-72tjk\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: E0930 13:53:16.019464 4936 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: E0930 13:53:16.019504 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert podName:b1538e13-4b0e-4bb9-9277-3d0475cd41a4 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:53:17.019489882 +0000 UTC m=+847.403492183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-5mdhl" (UID: "b1538e13-4b0e-4bb9-9277-3d0475cd41a4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.092478 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2"] Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.109094 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.112743 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ktzl7" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.121384 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72tjk\" (UniqueName: \"kubernetes.io/projected/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-kube-api-access-72tjk\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.121487 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvg2c\" (UniqueName: \"kubernetes.io/projected/d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28-kube-api-access-xvg2c\") pod \"rabbitmq-cluster-operator-manager-79d8469568-49bq2\" (UID: \"d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" Sep 30 
13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.121512 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: E0930 13:53:16.121642 4936 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: E0930 13:53:16.121692 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert podName:b4e378e0-0a69-47c9-b80f-fee159c4ad5b nodeName:}" failed. No retries permitted until 2025-09-30 13:53:16.621673214 +0000 UTC m=+847.005675515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert") pod "openstack-operator-controller-manager-64c94699b9-lqnhn" (UID: "b4e378e0-0a69-47c9-b80f-fee159c4ad5b") : secret "webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.127608 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2"] Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.166303 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tjk\" (UniqueName: \"kubernetes.io/projected/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-kube-api-access-72tjk\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.204066 4936 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r"] Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.214041 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht"] Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.222849 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvg2c\" (UniqueName: \"kubernetes.io/projected/d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28-kube-api-access-xvg2c\") pod \"rabbitmq-cluster-operator-manager-79d8469568-49bq2\" (UID: \"d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.260476 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvg2c\" (UniqueName: \"kubernetes.io/projected/d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28-kube-api-access-xvg2c\") pod \"rabbitmq-cluster-operator-manager-79d8469568-49bq2\" (UID: \"d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.526132 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.541554 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6-cert\") pod \"infra-operator-controller-manager-7d857cc749-hwpcz\" (UID: \"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6\") " 
pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.549933 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.582369 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.617256 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" event={"ID":"650ff8e9-279f-41ff-8bb8-1880e7cf985c","Type":"ContainerStarted","Data":"e1e2e57bbf3d691aa19b2e641edc9699ad7ad240c26994020298be35b888269b"} Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.623675 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" event={"ID":"189d95a0-9ee0-4055-86c2-724082d46a11","Type":"ContainerStarted","Data":"d33a55ffb71de829efa4a990f0316a7a903fedbee6c7e1470b22b1dc1654a4d4"} Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.627839 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerStarted","Data":"a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272"} Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.629136 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:16 crc kubenswrapper[4936]: 
E0930 13:53:16.629388 4936 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: E0930 13:53:16.629441 4936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert podName:b4e378e0-0a69-47c9-b80f-fee159c4ad5b nodeName:}" failed. No retries permitted until 2025-09-30 13:53:17.629425041 +0000 UTC m=+848.013427332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert") pod "openstack-operator-controller-manager-64c94699b9-lqnhn" (UID: "b4e378e0-0a69-47c9-b80f-fee159c4ad5b") : secret "webhook-server-cert" not found Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.710369 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fwh6j" podStartSLOduration=2.703719359 podStartE2EDuration="5.710351929s" podCreationTimestamp="2025-09-30 13:53:11 +0000 UTC" firstStartedPulling="2025-09-30 13:53:12.543507006 +0000 UTC m=+842.927509307" lastFinishedPulling="2025-09-30 13:53:15.550139586 +0000 UTC m=+845.934141877" observedRunningTime="2025-09-30 13:53:16.687657915 +0000 UTC m=+847.071660216" watchObservedRunningTime="2025-09-30 13:53:16.710351929 +0000 UTC m=+847.094354220" Sep 30 13:53:16 crc kubenswrapper[4936]: I0930 13:53:16.768712 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.026099 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.050945 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.053505 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.062703 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1538e13-4b0e-4bb9-9277-3d0475cd41a4-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-5mdhl\" (UID: \"b1538e13-4b0e-4bb9-9277-3d0475cd41a4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.078250 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.121513 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.213302 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.218767 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.258151 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h"] Sep 30 13:53:17 crc kubenswrapper[4936]: W0930 13:53:17.265652 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08bae3c_64f1_46de_ab2c_d6b2407c2d95.slice/crio-72eb7c298ffde3e5c78d29393a19afd4c86c1addd05d4395002524debd469bff WatchSource:0}: Error finding container 72eb7c298ffde3e5c78d29393a19afd4c86c1addd05d4395002524debd469bff: Status 404 returned error can't find the container with id 72eb7c298ffde3e5c78d29393a19afd4c86c1addd05d4395002524debd469bff Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.284072 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.655185 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" event={"ID":"9d8425ad-dcdc-4d31-9a5c-9461adb3296c","Type":"ContainerStarted","Data":"ecd63615388e25deb9dd129b65ae03dedccbe26be975967ef3b7021a724d202b"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.656559 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" 
event={"ID":"f632d83e-2c2c-4c90-8fea-5747d58633d6","Type":"ContainerStarted","Data":"6987df6804e093ce427bf8fe23dab084fc3778b0e903b79be0170ac1f6a5533d"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.658850 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" event={"ID":"bd9be0ef-9048-4e0a-b8d7-1b29b450984f","Type":"ContainerStarted","Data":"7f88421403661642ddf5971585108b91c2721cf6da471a9776f14a5bd58d6b37"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.662757 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.663603 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" event={"ID":"7cc732ee-78f8-4d20-aac8-67ab10b944d3","Type":"ContainerStarted","Data":"eff27aff979bc021db5f1ee800c6617e00a0bbaf54ce6d585c4af8c918c71294"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.664925 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" event={"ID":"13bd8563-ccb1-4445-b613-495e801195a4","Type":"ContainerStarted","Data":"4038747773ab1bc4bc7b1da95f055f47f11a176ee89c49b9bb0fa62bb7c5ac4a"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.667841 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4e378e0-0a69-47c9-b80f-fee159c4ad5b-cert\") pod \"openstack-operator-controller-manager-64c94699b9-lqnhn\" (UID: \"b4e378e0-0a69-47c9-b80f-fee159c4ad5b\") " 
pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.669446 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" event={"ID":"c54f70b2-5767-4616-878b-5816861d2637","Type":"ContainerStarted","Data":"507a9ed83189e73323efe89a9053f613300c4e4f3e1bc840963743b453e8413b"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.670623 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" event={"ID":"5ec29297-f0db-497d-aa05-e939e9aef380","Type":"ContainerStarted","Data":"c2263f013be903179881c01683c136f1279d9457ee6f419a0e9a6864bc1380ce"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.673080 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" event={"ID":"d08bae3c-64f1-46de-ab2c-d6b2407c2d95","Type":"ContainerStarted","Data":"72eb7c298ffde3e5c78d29393a19afd4c86c1addd05d4395002524debd469bff"} Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.694010 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.705067 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.718106 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.732610 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.737071 4936 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.742132 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.758279 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq"] Sep 30 13:53:17 crc kubenswrapper[4936]: E0930 13:53:17.821672 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xwt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-d2tm5_openstack-operators(639a60da-010a-40d5-bfec-6219ef3f712b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.833701 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.857173 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld"] Sep 30 13:53:17 crc kubenswrapper[4936]: I0930 13:53:17.987530 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-x2csq"] Sep 30 13:53:18 crc kubenswrapper[4936]: W0930 13:53:18.011992 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod673847ae_740d_4a3b_ad7e_09ec8848199d.slice/crio-4f248fbf291389fbb62d003a438670cd3672be46c0f0648dfff61b0fb628fc8c WatchSource:0}: Error finding container 4f248fbf291389fbb62d003a438670cd3672be46c0f0648dfff61b0fb628fc8c: Status 404 returned error can't find the container with id 4f248fbf291389fbb62d003a438670cd3672be46c0f0648dfff61b0fb628fc8c Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.027058 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz"] Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.054027 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzm2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-hwpcz_openstack-operators(7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.086176 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2"] Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.242686 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl"] Sep 30 13:53:18 crc kubenswrapper[4936]: W0930 13:53:18.391737 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1538e13_4b0e_4bb9_9277_3d0475cd41a4.slice/crio-04a33c7ac9f455bbe11c2e9a87da9f0ad17a5d13154170f0088593d00e82ec9a WatchSource:0}: Error finding container 04a33c7ac9f455bbe11c2e9a87da9f0ad17a5d13154170f0088593d00e82ec9a: Status 404 returned error can't find the container with id 04a33c7ac9f455bbe11c2e9a87da9f0ad17a5d13154170f0088593d00e82ec9a Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.411308 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2hcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-6d776955-5mdhl_openstack-operators(b1538e13-4b0e-4bb9-9277-3d0475cd41a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.555415 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podUID="639a60da-010a-40d5-bfec-6219ef3f712b" Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.715964 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" event={"ID":"847b4871-2d23-4790-b32a-b42698008fee","Type":"ContainerStarted","Data":"0ac114f3a60acadd9de4e481485aa75dbcfe6a7283fc816be00c60a96ab20f20"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.747636 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" event={"ID":"b1538e13-4b0e-4bb9-9277-3d0475cd41a4","Type":"ContainerStarted","Data":"04a33c7ac9f455bbe11c2e9a87da9f0ad17a5d13154170f0088593d00e82ec9a"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.749725 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" event={"ID":"639a60da-010a-40d5-bfec-6219ef3f712b","Type":"ContainerStarted","Data":"f9f65ee6dbc8969b4afb97faa32c5989cc73d25b2ea0c1454a50ee81f5ec7a33"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.749789 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" event={"ID":"639a60da-010a-40d5-bfec-6219ef3f712b","Type":"ContainerStarted","Data":"0633c9062b8f8072d92def654a08a4c8256ffe7a85b1a6c321860b0625588065"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.756255 4936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" event={"ID":"e546e1bb-9ee4-4549-9521-76d122b4edf5","Type":"ContainerStarted","Data":"4160deedabf6d9ee35f6837d2eb129279b61d2f09fb73bac201525cbf7db0ea1"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.763890 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" event={"ID":"bbbf4ed1-241b-4c4e-80e0-77acd778b868","Type":"ContainerStarted","Data":"8c955728e083848ef378822f42394d2ec0daf0d932bc2274e419a748098aa053"} Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.771511 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podUID="639a60da-010a-40d5-bfec-6219ef3f712b" Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.772849 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" event={"ID":"85448d25-86e9-4a2f-bc5a-339ab3d2112a","Type":"ContainerStarted","Data":"ab77aa498bdcd6aad7cd7942508bf3d20e9ea80261a5971466b1b44caba9a7f0"} Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.785275 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podUID="7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6" Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.785484 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" 
event={"ID":"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6","Type":"ContainerStarted","Data":"7f1b0fd2a63df90960a5f20a0a9eeee6d42fb2f07fcb84cc21b5a90423bf8c4e"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.785529 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" event={"ID":"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6","Type":"ContainerStarted","Data":"dae4a2f2a33830acf31c2daaef2611aaf372a33883c07196b9f1d4ab90f0944e"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.796867 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" event={"ID":"78f55939-d8fc-40d5-bc8e-a3f87b962b34","Type":"ContainerStarted","Data":"fddc1ae373af87677f1a82fd74b13a1457de7849752acc542dd1d56d97ad7a9f"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.815909 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" event={"ID":"d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28","Type":"ContainerStarted","Data":"d5c54831a6d1dedcaaa88051fbf9feddb93259ff3cdb315cc33dc501302394e8"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.829284 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn"] Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.832100 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" event={"ID":"d50d2534-deec-4173-a73a-d10b3beac452","Type":"ContainerStarted","Data":"1d9bdf65835f8f9eb817104e1f2ceabf03b585432984a7bac975f00025724c4d"} Sep 30 13:53:18 crc kubenswrapper[4936]: E0930 13:53:18.833221 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podUID="b1538e13-4b0e-4bb9-9277-3d0475cd41a4" Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.846152 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" event={"ID":"673847ae-740d-4a3b-ad7e-09ec8848199d","Type":"ContainerStarted","Data":"4f248fbf291389fbb62d003a438670cd3672be46c0f0648dfff61b0fb628fc8c"} Sep 30 13:53:18 crc kubenswrapper[4936]: I0930 13:53:18.851207 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" event={"ID":"699f5243-7ea5-4f7f-a537-51a99a871ccb","Type":"ContainerStarted","Data":"e43641f4771cbe8fc184d64706e9913548cacb78cd46dd8752afc57118e22281"} Sep 30 13:53:18 crc kubenswrapper[4936]: W0930 13:53:18.903934 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e378e0_0a69_47c9_b80f_fee159c4ad5b.slice/crio-4ba7eeb8a34f22a2ba0d88c22a9f5204adaef1c9a4049b7d67a64015fe577cc9 WatchSource:0}: Error finding container 4ba7eeb8a34f22a2ba0d88c22a9f5204adaef1c9a4049b7d67a64015fe577cc9: Status 404 returned error can't find the container with id 4ba7eeb8a34f22a2ba0d88c22a9f5204adaef1c9a4049b7d67a64015fe577cc9 Sep 30 13:53:19 crc kubenswrapper[4936]: I0930 13:53:19.915059 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" event={"ID":"b1538e13-4b0e-4bb9-9277-3d0475cd41a4","Type":"ContainerStarted","Data":"3402a96ec3bf9033ab914a5948d66d37afd8fcc27b5ef05ad181c9db208ad1d6"} Sep 30 13:53:19 crc kubenswrapper[4936]: E0930 13:53:19.918588 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podUID="b1538e13-4b0e-4bb9-9277-3d0475cd41a4" Sep 30 13:53:19 crc kubenswrapper[4936]: I0930 13:53:19.974074 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" event={"ID":"b4e378e0-0a69-47c9-b80f-fee159c4ad5b","Type":"ContainerStarted","Data":"a2bb5224e773d825742c2df7dcc18ea7f5ad2e02641e6478a2bcaad6b3be8c89"} Sep 30 13:53:19 crc kubenswrapper[4936]: I0930 13:53:19.974114 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" event={"ID":"b4e378e0-0a69-47c9-b80f-fee159c4ad5b","Type":"ContainerStarted","Data":"4ba7eeb8a34f22a2ba0d88c22a9f5204adaef1c9a4049b7d67a64015fe577cc9"} Sep 30 13:53:19 crc kubenswrapper[4936]: E0930 13:53:19.978680 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podUID="639a60da-010a-40d5-bfec-6219ef3f712b" Sep 30 13:53:19 crc kubenswrapper[4936]: E0930 13:53:19.979186 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podUID="7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6" Sep 30 13:53:21 crc kubenswrapper[4936]: I0930 13:53:21.018649 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" event={"ID":"b4e378e0-0a69-47c9-b80f-fee159c4ad5b","Type":"ContainerStarted","Data":"84ded35f30e66ad9f383a17ade679fb46f30bc8e827688878977169a93b3144a"} Sep 30 13:53:21 crc kubenswrapper[4936]: I0930 13:53:21.018979 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:21 crc kubenswrapper[4936]: E0930 13:53:21.047549 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podUID="7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6" Sep 30 13:53:21 crc kubenswrapper[4936]: E0930 13:53:21.047656 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podUID="b1538e13-4b0e-4bb9-9277-3d0475cd41a4" Sep 30 13:53:21 crc kubenswrapper[4936]: I0930 13:53:21.168442 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" podStartSLOduration=6.168409493 podStartE2EDuration="6.168409493s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:53:21.111887888 +0000 UTC m=+851.495890189" 
watchObservedRunningTime="2025-09-30 13:53:21.168409493 +0000 UTC m=+851.552411794" Sep 30 13:53:21 crc kubenswrapper[4936]: I0930 13:53:21.445228 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:21 crc kubenswrapper[4936]: I0930 13:53:21.445467 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:22 crc kubenswrapper[4936]: I0930 13:53:22.605751 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fwh6j" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="registry-server" probeResult="failure" output=< Sep 30 13:53:22 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 13:53:22 crc kubenswrapper[4936]: > Sep 30 13:53:27 crc kubenswrapper[4936]: I0930 13:53:27.839330 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-64c94699b9-lqnhn" Sep 30 13:53:31 crc kubenswrapper[4936]: I0930 13:53:31.506242 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:31 crc kubenswrapper[4936]: I0930 13:53:31.564075 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:31 crc kubenswrapper[4936]: I0930 13:53:31.743248 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:33 crc kubenswrapper[4936]: I0930 13:53:33.133351 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fwh6j" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="registry-server" 
containerID="cri-o://a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" gracePeriod=2 Sep 30 13:53:34 crc kubenswrapper[4936]: E0930 13:53:34.968018 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c" Sep 30 13:53:34 crc kubenswrapper[4936]: E0930 13:53:34.968936 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7fzmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-bc7dc7bd9-n6mv9_openstack-operators(bbbf4ed1-241b-4c4e-80e0-77acd778b868): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:35 crc kubenswrapper[4936]: I0930 13:53:35.149558 4936 generic.go:334] "Generic (PLEG): container finished" podID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerID="a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" exitCode=0 Sep 30 13:53:35 crc kubenswrapper[4936]: I0930 13:53:35.149600 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerDied","Data":"a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272"} Sep 30 13:53:35 crc kubenswrapper[4936]: E0930 13:53:35.803011 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2" Sep 30 13:53:35 crc kubenswrapper[4936]: E0930 13:53:35.803180 4936 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s7xxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-sxfzz_openstack-operators(699f5243-7ea5-4f7f-a537-51a99a871ccb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:36 crc kubenswrapper[4936]: E0930 13:53:36.488652 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1" Sep 30 13:53:36 crc kubenswrapper[4936]: E0930 13:53:36.488816 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cpf5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-operator-controller-manager-644bddb6d8-6c8cz_openstack-operators(7cc732ee-78f8-4d20-aac8-67ab10b944d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:38 crc kubenswrapper[4936]: E0930 13:53:38.639711 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80" Sep 30 13:53:38 crc kubenswrapper[4936]: E0930 13:53:38.640144 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5fhp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-x2csq_openstack-operators(673847ae-740d-4a3b-ad7e-09ec8848199d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:39 crc kubenswrapper[4936]: E0930 13:53:39.332271 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:f6b935f67979298c3c263ad84d277e5cf26c0dbba3f85f255c1ec4d1d75241d2" Sep 30 13:53:39 crc kubenswrapper[4936]: E0930 13:53:39.332637 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:f6b935f67979298c3c263ad84d277e5cf26c0dbba3f85f255c1ec4d1d75241d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v6xjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
designate-operator-controller-manager-84f4f7b77b-xz9zg_openstack-operators(9d8425ad-dcdc-4d31-9a5c-9461adb3296c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:40 crc kubenswrapper[4936]: E0930 13:53:40.729127 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:21792a2317c0a55e40b2a02a7d5d4682b76538ed2a2e0633199aa395e60ecc72" Sep 30 13:53:40 crc kubenswrapper[4936]: E0930 13:53:40.729504 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:21792a2317c0a55e40b2a02a7d5d4682b76538ed2a2e0633199aa395e60ecc72,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bc9kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-84958c4d49-fk2zk_openstack-operators(bd9be0ef-9048-4e0a-b8d7-1b29b450984f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.163436 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302" Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.163653 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n2xf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-9976ff44c-sthwj_openstack-operators(85448d25-86e9-4a2f-bc5a-339ab3d2112a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.445304 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272 is running failed: container process not found" containerID="a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.445747 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272 is running failed: container process not found" containerID="a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.445960 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272 is running failed: container process not found" containerID="a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.446022 4936 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-fwh6j" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" 
containerName="registry-server" Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.772323 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c" Sep 30 13:53:41 crc kubenswrapper[4936]: E0930 13:53:41.772527 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vpt57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5d889d78cf-g5k7h_openstack-operators(f632d83e-2c2c-4c90-8fea-5747d58633d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:44 crc kubenswrapper[4936]: E0930 13:53:44.978236 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8" Sep 30 13:53:44 crc kubenswrapper[4936]: E0930 13:53:44.979701 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xgcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-76fcc6dc7c-lhbld_openstack-operators(d50d2534-deec-4173-a73a-d10b3beac452): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:45 crc kubenswrapper[4936]: E0930 13:53:45.520705 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884" Sep 30 13:53:45 crc kubenswrapper[4936]: E0930 13:53:45.520930 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vbjc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6d68dbc695-sxftv_openstack-operators(d08bae3c-64f1-46de-ab2c-d6b2407c2d95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:45 crc kubenswrapper[4936]: E0930 13:53:45.899050 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd" Sep 30 13:53:45 crc kubenswrapper[4936]: E0930 13:53:45.899226 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v2zdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-operator-controller-manager-6ff8b75857-x5g5r_openstack-operators(650ff8e9-279f-41ff-8bb8-1880e7cf985c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:47 crc kubenswrapper[4936]: E0930 13:53:47.040040 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f" Sep 30 13:53:47 crc kubenswrapper[4936]: E0930 13:53:47.040926 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7mxj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-7rfmc_openstack-operators(78f55939-d8fc-40d5-bc8e-a3f87b962b34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:47 crc kubenswrapper[4936]: E0930 13:53:47.402684 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a" Sep 30 13:53:47 crc kubenswrapper[4936]: E0930 13:53:47.402910 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mf9np,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-88c7-8tpqq_openstack-operators(847b4871-2d23-4790-b32a-b42698008fee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:48 crc kubenswrapper[4936]: E0930 13:53:48.760567 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b" Sep 30 13:53:48 crc kubenswrapper[4936]: E0930 13:53:48.761003 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xwt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-d2tm5_openstack-operators(639a60da-010a-40d5-bfec-6219ef3f712b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:48 crc kubenswrapper[4936]: E0930 13:53:48.762349 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podUID="639a60da-010a-40d5-bfec-6219ef3f712b" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.150777 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.150981 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzm2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-hwpcz_openstack-operators(7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.152174 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podUID="7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.490719 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.493384 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-w
orker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podif
ied-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTR
ON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-
centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:curre
nt-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cent
os9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-maste
r-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2hcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6d776955-5mdhl_openstack-operators(b1538e13-4b0e-4bb9-9277-3d0475cd41a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.495450 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podUID="b1538e13-4b0e-4bb9-9277-3d0475cd41a4" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.884261 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.884529 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvg2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-49bq2_openstack-operators(d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Sep 30 13:53:49 crc kubenswrapper[4936]: E0930 13:53:49.886640 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" podUID="d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28" Sep 30 13:53:49 crc kubenswrapper[4936]: I0930 13:53:49.954682 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.059645 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" podUID="7cc732ee-78f8-4d20-aac8-67ab10b944d3" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.126833 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content\") pod \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.127032 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8z4s\" (UniqueName: \"kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s\") pod \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.127098 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities\") pod 
\"a2866263-7d1e-47c3-8f8f-25bb43505e5e\" (UID: \"a2866263-7d1e-47c3-8f8f-25bb43505e5e\") " Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.127732 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities" (OuterVolumeSpecName: "utilities") pod "a2866263-7d1e-47c3-8f8f-25bb43505e5e" (UID: "a2866263-7d1e-47c3-8f8f-25bb43505e5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.138647 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s" (OuterVolumeSpecName: "kube-api-access-r8z4s") pod "a2866263-7d1e-47c3-8f8f-25bb43505e5e" (UID: "a2866263-7d1e-47c3-8f8f-25bb43505e5e"). InnerVolumeSpecName "kube-api-access-r8z4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.180412 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2866263-7d1e-47c3-8f8f-25bb43505e5e" (UID: "a2866263-7d1e-47c3-8f8f-25bb43505e5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.229140 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.229174 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2866263-7d1e-47c3-8f8f-25bb43505e5e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.229188 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8z4s\" (UniqueName: \"kubernetes.io/projected/a2866263-7d1e-47c3-8f8f-25bb43505e5e-kube-api-access-r8z4s\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.264620 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwh6j" event={"ID":"a2866263-7d1e-47c3-8f8f-25bb43505e5e","Type":"ContainerDied","Data":"95b9c79ae1551c262581571be1a8ec596992214019c645f36c9a8e514eab3850"} Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.264651 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwh6j" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.264680 4936 scope.go:117] "RemoveContainer" containerID="a228f6c3800cc8ba4db4e093aef27a41717ea0092ac07af8187408ce9a6cf272" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.268018 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" event={"ID":"7cc732ee-78f8-4d20-aac8-67ab10b944d3","Type":"ContainerStarted","Data":"3af49b3d1fc55accca9905c4d20d3a1202c8ffd02f9feb93943d1c5afffafc10"} Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.271935 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" podUID="d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.347499 4936 scope.go:117] "RemoveContainer" containerID="a2fba98cf427b008158fcc51b77e456c6b30ed1d08794f4d00319d3d0b7704e6" Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.374651 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.389925 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fwh6j"] Sep 30 13:53:50 crc kubenswrapper[4936]: I0930 13:53:50.414517 4936 scope.go:117] "RemoveContainer" containerID="b51d821947355c45a982a5cfd0f06c341f0dbb600f2fdca3c963c983f65cfc2c" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.521719 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" podUID="9d8425ad-dcdc-4d31-9a5c-9461adb3296c" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.553161 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" podUID="bd9be0ef-9048-4e0a-b8d7-1b29b450984f" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.568880 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" podUID="bbbf4ed1-241b-4c4e-80e0-77acd778b868" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.574155 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" podUID="85448d25-86e9-4a2f-bc5a-339ab3d2112a" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.654175 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" podUID="673847ae-740d-4a3b-ad7e-09ec8848199d" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.704105 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" podUID="699f5243-7ea5-4f7f-a537-51a99a871ccb" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.736858 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" podUID="78f55939-d8fc-40d5-bc8e-a3f87b962b34" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.756975 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" podUID="d50d2534-deec-4173-a73a-d10b3beac452" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.802593 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" podUID="847b4871-2d23-4790-b32a-b42698008fee" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.910571 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" podUID="f632d83e-2c2c-4c90-8fea-5747d58633d6" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.924131 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" 
podUID="650ff8e9-279f-41ff-8bb8-1880e7cf985c" Sep 30 13:53:50 crc kubenswrapper[4936]: E0930 13:53:50.944806 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" podUID="d08bae3c-64f1-46de-ab2c-d6b2407c2d95" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.288816 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" event={"ID":"650ff8e9-279f-41ff-8bb8-1880e7cf985c","Type":"ContainerStarted","Data":"ca5ad05c920b9ff532652c513cd683efdd49f43313fec382f8de625caa823c13"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.292212 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" podUID="650ff8e9-279f-41ff-8bb8-1880e7cf985c" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.318644 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" event={"ID":"189d95a0-9ee0-4055-86c2-724082d46a11","Type":"ContainerStarted","Data":"c9aafb2852d71e2e17671c04ce0e277a499a7ebcc72f7574673dbb116a968714"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.318706 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" event={"ID":"189d95a0-9ee0-4055-86c2-724082d46a11","Type":"ContainerStarted","Data":"3c904036c6fb62181766c094131865c0403b904b0a02611415dccc71ad692fd5"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 
13:53:51.373037 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" event={"ID":"5ec29297-f0db-497d-aa05-e939e9aef380","Type":"ContainerStarted","Data":"76969b641ca49d4dbe94532576922743c830d02cd9d132a5ee38ebe8591f5c8c"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.374248 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.392198 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" podStartSLOduration=4.249833102 podStartE2EDuration="37.392174819s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:16.339005585 +0000 UTC m=+846.723007886" lastFinishedPulling="2025-09-30 13:53:49.481347292 +0000 UTC m=+879.865349603" observedRunningTime="2025-09-30 13:53:51.371813971 +0000 UTC m=+881.755816282" watchObservedRunningTime="2025-09-30 13:53:51.392174819 +0000 UTC m=+881.776177120" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.397719 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" event={"ID":"c54f70b2-5767-4616-878b-5816861d2637","Type":"ContainerStarted","Data":"18664173a3d3db0900be2e7f7b2a21915ca4f79dc02f7ae885c403463c893cb6"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.397965 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" event={"ID":"c54f70b2-5767-4616-878b-5816861d2637","Type":"ContainerStarted","Data":"e0958f8830c9ee214dad232f7e8a64941409b3cd218564ac5e8efaebb5febf7b"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.399031 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.432231 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" podStartSLOduration=5.209315656 podStartE2EDuration="37.432214976s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.257642639 +0000 UTC m=+847.641644930" lastFinishedPulling="2025-09-30 13:53:49.480541949 +0000 UTC m=+879.864544250" observedRunningTime="2025-09-30 13:53:51.430750275 +0000 UTC m=+881.814752576" watchObservedRunningTime="2025-09-30 13:53:51.432214976 +0000 UTC m=+881.816217277" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.434415 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" event={"ID":"78f55939-d8fc-40d5-bc8e-a3f87b962b34","Type":"ContainerStarted","Data":"d352be81d45dee556357c71e19c30da852277c42e503d426aab71c539a629a97"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.436144 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" podUID="78f55939-d8fc-40d5-bc8e-a3f87b962b34" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.448376 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" event={"ID":"d50d2534-deec-4173-a73a-d10b3beac452","Type":"ContainerStarted","Data":"dbe360bf26ec2e7f25afc182ab94b11491e8d8b874a88884797790c2458c04b0"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.453426 4936 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" podUID="d50d2534-deec-4173-a73a-d10b3beac452" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.459090 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" podStartSLOduration=5.325273639 podStartE2EDuration="37.459073179s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.346725559 +0000 UTC m=+847.730727860" lastFinishedPulling="2025-09-30 13:53:49.480525099 +0000 UTC m=+879.864527400" observedRunningTime="2025-09-30 13:53:51.457640478 +0000 UTC m=+881.841642779" watchObservedRunningTime="2025-09-30 13:53:51.459073179 +0000 UTC m=+881.843075470" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.469972 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" event={"ID":"13bd8563-ccb1-4445-b613-495e801195a4","Type":"ContainerStarted","Data":"32fde13dcf37ff88e19cb81503b92cc71fac53b9cda71e40a7b026d7cadb21e2"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.471058 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.493682 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" event={"ID":"e546e1bb-9ee4-4549-9521-76d122b4edf5","Type":"ContainerStarted","Data":"a98061597d9cd31c76096a91b68896f896310705fc3176c434a338503d36efc7"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 
13:53:51.508711 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" event={"ID":"847b4871-2d23-4790-b32a-b42698008fee","Type":"ContainerStarted","Data":"4462b2ea7f84a532c42ee33cf7dce0543da72082fc96763a19a29eaf1200151d"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.509925 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" podUID="847b4871-2d23-4790-b32a-b42698008fee" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.516273 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" event={"ID":"9d8425ad-dcdc-4d31-9a5c-9461adb3296c","Type":"ContainerStarted","Data":"7135f0b30b01c9b53a65e7ff3d94d80b86054710ca3d48faf0fa40ee2dcb6730"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.536863 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" event={"ID":"f632d83e-2c2c-4c90-8fea-5747d58633d6","Type":"ContainerStarted","Data":"62b65aa5455232d7ce04bc5bff0e8421555c6d19d49f01e36f521d454f4c1f5f"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.545227 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" podUID="f632d83e-2c2c-4c90-8fea-5747d58633d6" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.551425 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" event={"ID":"673847ae-740d-4a3b-ad7e-09ec8848199d","Type":"ContainerStarted","Data":"b47237ffc62387d209fa05fd35c5d04ffe1419b9aff5fb01458bf98ba53ae8dd"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.564981 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" podStartSLOduration=5.227147643 podStartE2EDuration="37.564966896s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.142704256 +0000 UTC m=+847.526706557" lastFinishedPulling="2025-09-30 13:53:49.480523519 +0000 UTC m=+879.864525810" observedRunningTime="2025-09-30 13:53:51.563387521 +0000 UTC m=+881.947389822" watchObservedRunningTime="2025-09-30 13:53:51.564966896 +0000 UTC m=+881.948969197" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.571086 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" event={"ID":"699f5243-7ea5-4f7f-a537-51a99a871ccb","Type":"ContainerStarted","Data":"dba340081f3bc90a7dc4b56ffce77947e83359aa82ffbb6a106969f323fb9f29"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.585736 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" event={"ID":"85448d25-86e9-4a2f-bc5a-339ab3d2112a","Type":"ContainerStarted","Data":"1e24cdd53641717e0f4c7cf4f82f72cf184d0932847129b8b6f0607efedc1595"} Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.590326 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" event={"ID":"bbbf4ed1-241b-4c4e-80e0-77acd778b868","Type":"ContainerStarted","Data":"27efd889c80d1f951d50ecbc8c72285d3e9edc44c3b0534eda9a34d000944c69"} Sep 30 13:53:51 crc 
kubenswrapper[4936]: I0930 13:53:51.611782 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" event={"ID":"d08bae3c-64f1-46de-ab2c-d6b2407c2d95","Type":"ContainerStarted","Data":"a39214861683729e3737607bac2a431615fc720e613e45ce4aa710e44209f2e8"} Sep 30 13:53:51 crc kubenswrapper[4936]: E0930 13:53:51.616635 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" podUID="d08bae3c-64f1-46de-ab2c-d6b2407c2d95" Sep 30 13:53:51 crc kubenswrapper[4936]: I0930 13:53:51.622792 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" event={"ID":"bd9be0ef-9048-4e0a-b8d7-1b29b450984f","Type":"ContainerStarted","Data":"2784595f4291ab6892be9a654b31e29308d761eee69d0b3d8c9ada53b5c710b0"} Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.324063 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" path="/var/lib/kubelet/pods/a2866263-7d1e-47c3-8f8f-25bb43505e5e/volumes" Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.635775 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" event={"ID":"5ec29297-f0db-497d-aa05-e939e9aef380","Type":"ContainerStarted","Data":"e22dde3d4c5f41f2c861c22ed1f29bfdc4b5953bdcd8a37d045e9962d86da0fb"} Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.638118 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" 
event={"ID":"7cc732ee-78f8-4d20-aac8-67ab10b944d3","Type":"ContainerStarted","Data":"0c27130909c7c7fe9e5b4873424abb0b0bc101ca59dfa3d9f61662d29e1f09a5"} Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.638258 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.639743 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" event={"ID":"13bd8563-ccb1-4445-b613-495e801195a4","Type":"ContainerStarted","Data":"26037b7562d7b51950fcd60d93dbc81727e211d1df7bade3170b7ff05c7a0a13"} Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.642160 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" event={"ID":"e546e1bb-9ee4-4549-9521-76d122b4edf5","Type":"ContainerStarted","Data":"96700c88cade2864b997100eb099d5aaadb829de2b380c293bc0a045b0b8635d"} Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.642587 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:52 crc kubenswrapper[4936]: E0930 13:53:52.643949 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" podUID="d50d2534-deec-4173-a73a-d10b3beac452" Sep 30 13:53:52 crc kubenswrapper[4936]: E0930 13:53:52.643974 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" podUID="d08bae3c-64f1-46de-ab2c-d6b2407c2d95" Sep 30 13:53:52 crc kubenswrapper[4936]: E0930 13:53:52.644199 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" podUID="847b4871-2d23-4790-b32a-b42698008fee" Sep 30 13:53:52 crc kubenswrapper[4936]: E0930 13:53:52.644314 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" podUID="78f55939-d8fc-40d5-bc8e-a3f87b962b34" Sep 30 13:53:52 crc kubenswrapper[4936]: E0930 13:53:52.644321 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" podUID="650ff8e9-279f-41ff-8bb8-1880e7cf985c" Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.663131 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" podStartSLOduration=4.40901179 podStartE2EDuration="38.663112646s" podCreationTimestamp="2025-09-30 13:53:14 
+0000 UTC" firstStartedPulling="2025-09-30 13:53:16.734406452 +0000 UTC m=+847.118408743" lastFinishedPulling="2025-09-30 13:53:50.988507298 +0000 UTC m=+881.372509599" observedRunningTime="2025-09-30 13:53:52.65900417 +0000 UTC m=+883.043006491" watchObservedRunningTime="2025-09-30 13:53:52.663112646 +0000 UTC m=+883.047114947" Sep 30 13:53:52 crc kubenswrapper[4936]: I0930 13:53:52.717683 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" podStartSLOduration=6.973379785 podStartE2EDuration="38.717665775s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.74086574 +0000 UTC m=+848.124868041" lastFinishedPulling="2025-09-30 13:53:49.48515172 +0000 UTC m=+879.869154031" observedRunningTime="2025-09-30 13:53:52.711972854 +0000 UTC m=+883.095975145" watchObservedRunningTime="2025-09-30 13:53:52.717665775 +0000 UTC m=+883.101668076" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.650605 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" event={"ID":"bd9be0ef-9048-4e0a-b8d7-1b29b450984f","Type":"ContainerStarted","Data":"e8262cfd89858331765fe44aba924a24fbb42b92bf31b47d225de02ee996cce4"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.651481 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.652872 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" event={"ID":"bbbf4ed1-241b-4c4e-80e0-77acd778b868","Type":"ContainerStarted","Data":"a56538eb1684062ca96571b5912c931e13ae0c43102f948f4658ff9110e7c35e"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.653243 4936 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.655601 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" event={"ID":"85448d25-86e9-4a2f-bc5a-339ab3d2112a","Type":"ContainerStarted","Data":"b75e307e4160779a60be722766967b173838e44e5d67c5cb374516e6833046c8"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.656116 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.657324 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" event={"ID":"9d8425ad-dcdc-4d31-9a5c-9461adb3296c","Type":"ContainerStarted","Data":"6dd309a0f5745cd3e0f421b28ab8b58ca18c33cff9b1d624cab09b38f9326153"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.657911 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.660960 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" event={"ID":"f632d83e-2c2c-4c90-8fea-5747d58633d6","Type":"ContainerStarted","Data":"b31463b246d8891084ab29fdef4264c53e3648c9ead192068c4fe06a378654fc"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.661273 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.668237 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" 
event={"ID":"673847ae-740d-4a3b-ad7e-09ec8848199d","Type":"ContainerStarted","Data":"52f928a54ef92c3033a2ca601df7997f526aa2b1a684537e9954701eec444a04"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.669188 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.675936 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" event={"ID":"699f5243-7ea5-4f7f-a537-51a99a871ccb","Type":"ContainerStarted","Data":"27f8a5217c328ec6953a88cb22d40363612573280c4553e45d28ab5ade07fa17"} Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.676276 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" podStartSLOduration=3.9057089 podStartE2EDuration="39.676262504s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.150541998 +0000 UTC m=+847.534544299" lastFinishedPulling="2025-09-30 13:53:52.921095602 +0000 UTC m=+883.305097903" observedRunningTime="2025-09-30 13:53:53.672379054 +0000 UTC m=+884.056381375" watchObservedRunningTime="2025-09-30 13:53:53.676262504 +0000 UTC m=+884.060264805" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.676980 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.743215 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" podStartSLOduration=3.857277716 podStartE2EDuration="39.743191915s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.038830766 +0000 UTC m=+847.422833067" 
lastFinishedPulling="2025-09-30 13:53:52.924744965 +0000 UTC m=+883.308747266" observedRunningTime="2025-09-30 13:53:53.737431111 +0000 UTC m=+884.121433422" watchObservedRunningTime="2025-09-30 13:53:53.743191915 +0000 UTC m=+884.127194226" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.764850 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" podStartSLOduration=3.658828415 podStartE2EDuration="38.764834729s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.81376477 +0000 UTC m=+848.197767071" lastFinishedPulling="2025-09-30 13:53:52.919771084 +0000 UTC m=+883.303773385" observedRunningTime="2025-09-30 13:53:53.760794034 +0000 UTC m=+884.144796335" watchObservedRunningTime="2025-09-30 13:53:53.764834729 +0000 UTC m=+884.148837030" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.785109 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" podStartSLOduration=3.946684984 podStartE2EDuration="39.785085654s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.286620382 +0000 UTC m=+847.670622683" lastFinishedPulling="2025-09-30 13:53:53.125021052 +0000 UTC m=+883.509023353" observedRunningTime="2025-09-30 13:53:53.779207657 +0000 UTC m=+884.163209978" watchObservedRunningTime="2025-09-30 13:53:53.785085654 +0000 UTC m=+884.169087955" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.800964 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" podStartSLOduration=3.694460067 podStartE2EDuration="38.800949185s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.81482133 +0000 UTC m=+848.198823631" lastFinishedPulling="2025-09-30 
13:53:52.921310448 +0000 UTC m=+883.305312749" observedRunningTime="2025-09-30 13:53:53.799156784 +0000 UTC m=+884.183159095" watchObservedRunningTime="2025-09-30 13:53:53.800949185 +0000 UTC m=+884.184951476" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.856681 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" podStartSLOduration=3.889223548 podStartE2EDuration="38.856664017s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:18.053036124 +0000 UTC m=+848.437038425" lastFinishedPulling="2025-09-30 13:53:53.020476593 +0000 UTC m=+883.404478894" observedRunningTime="2025-09-30 13:53:53.825374148 +0000 UTC m=+884.209376449" watchObservedRunningTime="2025-09-30 13:53:53.856664017 +0000 UTC m=+884.240666318" Sep 30 13:53:53 crc kubenswrapper[4936]: I0930 13:53:53.860511 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" podStartSLOduration=3.733747703 podStartE2EDuration="38.860499426s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.795844141 +0000 UTC m=+848.179846442" lastFinishedPulling="2025-09-30 13:53:52.922595864 +0000 UTC m=+883.306598165" observedRunningTime="2025-09-30 13:53:53.85432563 +0000 UTC m=+884.238327931" watchObservedRunningTime="2025-09-30 13:53:53.860499426 +0000 UTC m=+884.244501727" Sep 30 13:53:54 crc kubenswrapper[4936]: I0930 13:53:54.687941 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:53:55 crc kubenswrapper[4936]: I0930 13:53:55.138022 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9jjnj" Sep 30 13:53:55 crc kubenswrapper[4936]: I0930 
13:53:55.155428 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-n28ht" Sep 30 13:53:55 crc kubenswrapper[4936]: I0930 13:53:55.349905 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-hgdlx" Sep 30 13:53:55 crc kubenswrapper[4936]: I0930 13:53:55.403703 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kldb" Sep 30 13:53:55 crc kubenswrapper[4936]: I0930 13:53:55.432362 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-zvzfj" Sep 30 13:54:01 crc kubenswrapper[4936]: E0930 13:54:01.318597 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podUID="639a60da-010a-40d5-bfec-6219ef3f712b" Sep 30 13:54:03 crc kubenswrapper[4936]: E0930 13:54:03.319163 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podUID="7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6" Sep 30 13:54:04 crc kubenswrapper[4936]: E0930 13:54:04.317201 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podUID="b1538e13-4b0e-4bb9-9277-3d0475cd41a4" Sep 30 13:54:04 crc kubenswrapper[4936]: I0930 13:54:04.900603 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-6c8cz" Sep 30 13:54:04 crc kubenswrapper[4936]: I0930 13:54:04.948716 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-xz9zg" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.005851 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fk2zk" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.037176 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-g5k7h" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.576690 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-sthwj" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.727468 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-sxfzz" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.771454 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-n6mv9" Sep 30 13:54:05 crc kubenswrapper[4936]: I0930 13:54:05.907989 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-x2csq" Sep 30 
13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.767780 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" event={"ID":"78f55939-d8fc-40d5-bc8e-a3f87b962b34","Type":"ContainerStarted","Data":"0bd11669f9f659a1e74aba41401d8916d350c01e5a8c1a147138f7ec3a4cca8a"} Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.768186 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.769637 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" event={"ID":"d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28","Type":"ContainerStarted","Data":"cb8e499169488344b40eebe74a20189c568bbfbd44ef78526abd63fc962f8370"} Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.771701 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" event={"ID":"d50d2534-deec-4173-a73a-d10b3beac452","Type":"ContainerStarted","Data":"7e72affe5725320b5bfebd016514980897132bb4f5e9d76ed89ab676be92cf0b"} Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.771869 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.786391 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" podStartSLOduration=3.733089912 podStartE2EDuration="51.786375591s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.795389908 +0000 UTC m=+848.179392219" lastFinishedPulling="2025-09-30 13:54:05.848675597 +0000 UTC m=+896.232677898" observedRunningTime="2025-09-30 
13:54:06.783817269 +0000 UTC m=+897.167819570" watchObservedRunningTime="2025-09-30 13:54:06.786375591 +0000 UTC m=+897.170377892" Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.834910 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" podStartSLOduration=3.800968241 podStartE2EDuration="51.834892399s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.957013798 +0000 UTC m=+848.341016099" lastFinishedPulling="2025-09-30 13:54:05.990937966 +0000 UTC m=+896.374940257" observedRunningTime="2025-09-30 13:54:06.816708983 +0000 UTC m=+897.200711284" watchObservedRunningTime="2025-09-30 13:54:06.834892399 +0000 UTC m=+897.218894700" Sep 30 13:54:06 crc kubenswrapper[4936]: I0930 13:54:06.835752 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-49bq2" podStartSLOduration=3.042580531 podStartE2EDuration="50.835745703s" podCreationTimestamp="2025-09-30 13:53:16 +0000 UTC" firstStartedPulling="2025-09-30 13:53:18.277702743 +0000 UTC m=+848.661705044" lastFinishedPulling="2025-09-30 13:54:06.070867915 +0000 UTC m=+896.454870216" observedRunningTime="2025-09-30 13:54:06.831387859 +0000 UTC m=+897.215390160" watchObservedRunningTime="2025-09-30 13:54:06.835745703 +0000 UTC m=+897.219748004" Sep 30 13:54:07 crc kubenswrapper[4936]: I0930 13:54:07.788002 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" event={"ID":"650ff8e9-279f-41ff-8bb8-1880e7cf985c","Type":"ContainerStarted","Data":"53f2add97795c9b08296602aed194f03ae1dd476909879ee9c0ed4fd903494ac"} Sep 30 13:54:07 crc kubenswrapper[4936]: I0930 13:54:07.788261 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.794874 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" event={"ID":"847b4871-2d23-4790-b32a-b42698008fee","Type":"ContainerStarted","Data":"06a910ab4cf10222734078957e87ba6e1f8b80e147e49af1a98620a56b6d9580"} Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.795382 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.797705 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" event={"ID":"d08bae3c-64f1-46de-ab2c-d6b2407c2d95","Type":"ContainerStarted","Data":"a496ad740666f9ec101e311a7e1d8c298c395ecfa886df4eb76a472e8c85ec02"} Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.813868 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" podStartSLOduration=4.386354429 podStartE2EDuration="54.813848381s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:16.339909741 +0000 UTC m=+846.723912052" lastFinishedPulling="2025-09-30 13:54:06.767403703 +0000 UTC m=+897.151406004" observedRunningTime="2025-09-30 13:54:07.805960572 +0000 UTC m=+898.189962883" watchObservedRunningTime="2025-09-30 13:54:08.813848381 +0000 UTC m=+899.197850682" Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.814031 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" podStartSLOduration=4.853218706 podStartE2EDuration="54.814026766s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 
13:53:17.814411669 +0000 UTC m=+848.198413980" lastFinishedPulling="2025-09-30 13:54:07.775219749 +0000 UTC m=+898.159222040" observedRunningTime="2025-09-30 13:54:08.810067843 +0000 UTC m=+899.194070144" watchObservedRunningTime="2025-09-30 13:54:08.814026766 +0000 UTC m=+899.198029067" Sep 30 13:54:08 crc kubenswrapper[4936]: I0930 13:54:08.824578 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" podStartSLOduration=4.336202795 podStartE2EDuration="54.824559145s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.288782104 +0000 UTC m=+847.672784405" lastFinishedPulling="2025-09-30 13:54:07.777138454 +0000 UTC m=+898.161140755" observedRunningTime="2025-09-30 13:54:08.823004291 +0000 UTC m=+899.207006592" watchObservedRunningTime="2025-09-30 13:54:08.824559145 +0000 UTC m=+899.208561446" Sep 30 13:54:14 crc kubenswrapper[4936]: I0930 13:54:14.876434 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-x5g5r" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.292286 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.295260 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-sxftv" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.375168 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8tpqq" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.520820 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-lhbld" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.841482 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" event={"ID":"639a60da-010a-40d5-bfec-6219ef3f712b","Type":"ContainerStarted","Data":"2571a560df02704c6d904f3b83c356973cc2615c6ac520b0c40b28adf90f3544"} Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.841713 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.885761 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-7rfmc" Sep 30 13:54:15 crc kubenswrapper[4936]: I0930 13:54:15.905743 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" podStartSLOduration=3.793965852 podStartE2EDuration="1m0.905726621s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:17.82150518 +0000 UTC m=+848.205507471" lastFinishedPulling="2025-09-30 13:54:14.933265939 +0000 UTC m=+905.317268240" observedRunningTime="2025-09-30 13:54:15.859054036 +0000 UTC m=+906.243056357" watchObservedRunningTime="2025-09-30 13:54:15.905726621 +0000 UTC m=+906.289728912" Sep 30 13:54:17 crc kubenswrapper[4936]: I0930 13:54:17.859673 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" event={"ID":"7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6","Type":"ContainerStarted","Data":"1680a991bf8fedae72f54a3ac5f075d0f95edde7eae5b8194025cec89082022e"} Sep 30 13:54:17 crc kubenswrapper[4936]: I0930 13:54:17.860410 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:54:17 crc kubenswrapper[4936]: I0930 13:54:17.878147 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" podStartSLOduration=5.181156357 podStartE2EDuration="1m3.878122226s" podCreationTimestamp="2025-09-30 13:53:14 +0000 UTC" firstStartedPulling="2025-09-30 13:53:18.053809396 +0000 UTC m=+848.437811697" lastFinishedPulling="2025-09-30 13:54:16.750775265 +0000 UTC m=+907.134777566" observedRunningTime="2025-09-30 13:54:17.877485148 +0000 UTC m=+908.261487469" watchObservedRunningTime="2025-09-30 13:54:17.878122226 +0000 UTC m=+908.262124537" Sep 30 13:54:18 crc kubenswrapper[4936]: I0930 13:54:18.250562 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:54:18 crc kubenswrapper[4936]: I0930 13:54:18.250622 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:54:18 crc kubenswrapper[4936]: I0930 13:54:18.868926 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" event={"ID":"b1538e13-4b0e-4bb9-9277-3d0475cd41a4","Type":"ContainerStarted","Data":"14d5e38a0510e51b3eecbddee1aa29acbfeaa87bc626f7afd24d571532b74b2c"} Sep 30 13:54:18 crc kubenswrapper[4936]: I0930 13:54:18.869717 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:54:18 crc kubenswrapper[4936]: I0930 13:54:18.898421 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" podStartSLOduration=4.360493259 podStartE2EDuration="1m3.898398887s" podCreationTimestamp="2025-09-30 13:53:15 +0000 UTC" firstStartedPulling="2025-09-30 13:53:18.410634408 +0000 UTC m=+848.794636709" lastFinishedPulling="2025-09-30 13:54:17.948540046 +0000 UTC m=+908.332542337" observedRunningTime="2025-09-30 13:54:18.894492186 +0000 UTC m=+909.278494607" watchObservedRunningTime="2025-09-30 13:54:18.898398887 +0000 UTC m=+909.282401188" Sep 30 13:54:25 crc kubenswrapper[4936]: I0930 13:54:25.974976 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-d2tm5" Sep 30 13:54:26 crc kubenswrapper[4936]: I0930 13:54:26.590817 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-hwpcz" Sep 30 13:54:27 crc kubenswrapper[4936]: I0930 13:54:27.127639 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-5mdhl" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.247652 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:54:45 crc kubenswrapper[4936]: E0930 13:54:45.248619 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="registry-server" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.248634 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="registry-server" Sep 30 13:54:45 crc 
kubenswrapper[4936]: E0930 13:54:45.248658 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="extract-utilities" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.248666 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="extract-utilities" Sep 30 13:54:45 crc kubenswrapper[4936]: E0930 13:54:45.248695 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="extract-content" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.248701 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="extract-content" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.248846 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2866263-7d1e-47c3-8f8f-25bb43505e5e" containerName="registry-server" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.249615 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.252795 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.253548 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.255093 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.263697 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rkggj" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.269587 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.287257 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.287743 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xz9\" (UniqueName: \"kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.317179 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.318385 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.322214 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.341392 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.389124 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.389233 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xz9\" (UniqueName: \"kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.389267 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bfr\" (UniqueName: \"kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.389329 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc 
kubenswrapper[4936]: I0930 13:54:45.389418 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.390306 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.422402 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xz9\" (UniqueName: \"kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9\") pod \"dnsmasq-dns-675f4bcbfc-pngvg\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.490917 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9bfr\" (UniqueName: \"kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.491427 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.491581 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.492414 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.498308 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.508075 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bfr\" (UniqueName: \"kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr\") pod \"dnsmasq-dns-78dd6ddcc-gw8l5\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.581461 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:54:45 crc kubenswrapper[4936]: I0930 13:54:45.649070 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:54:46 crc kubenswrapper[4936]: I0930 13:54:46.120761 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:54:46 crc kubenswrapper[4936]: I0930 13:54:46.122881 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:54:46 crc kubenswrapper[4936]: I0930 13:54:46.187072 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:54:46 crc kubenswrapper[4936]: W0930 13:54:46.188927 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef87a30a_fcc3_4259_b917_a0975285ddce.slice/crio-8db6f33db8a41f95549d18d9393a33182ba40d52cf36798c01810b0219b08d0e WatchSource:0}: Error finding container 8db6f33db8a41f95549d18d9393a33182ba40d52cf36798c01810b0219b08d0e: Status 404 returned error can't find the container with id 8db6f33db8a41f95549d18d9393a33182ba40d52cf36798c01810b0219b08d0e Sep 30 13:54:47 crc kubenswrapper[4936]: I0930 13:54:47.068206 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" event={"ID":"68a0ad3f-c240-417f-ae00-81cc1d640696","Type":"ContainerStarted","Data":"22ca357c4cee0a1e0c8d77a8e9de1f95c94d4f2377bde24c58965e61111866f4"} Sep 30 13:54:47 crc kubenswrapper[4936]: I0930 13:54:47.071702 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" event={"ID":"ef87a30a-fcc3-4259-b917-a0975285ddce","Type":"ContainerStarted","Data":"8db6f33db8a41f95549d18d9393a33182ba40d52cf36798c01810b0219b08d0e"} Sep 30 13:54:47 crc kubenswrapper[4936]: I0930 13:54:47.992469 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.026591 4936 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.027770 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.048176 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.139427 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448wr\" (UniqueName: \"kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.139513 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.139551 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.240689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448wr\" (UniqueName: \"kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " 
pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.241069 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.241116 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.242370 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.242460 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.250415 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.250470 4936 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.281763 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448wr\" (UniqueName: \"kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr\") pod \"dnsmasq-dns-666b6646f7-2rq2m\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.398734 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.427039 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.481106 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.483452 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.502698 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.553677 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckjr\" (UniqueName: \"kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.553778 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.553846 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.659040 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nckjr\" (UniqueName: \"kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.659394 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.659435 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.660658 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.661352 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.703412 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckjr\" (UniqueName: \"kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr\") pod \"dnsmasq-dns-57d769cc4f-rpth5\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:48 crc kubenswrapper[4936]: I0930 13:54:48.871764 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.084324 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.255093 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.259095 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.269067 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.269318 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.269541 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.269677 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.269903 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.270115 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6llk4" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.271524 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.278102 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.373983 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.374070 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rndm8\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.374116 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.374167 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.374199 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378455 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378578 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378615 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378639 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378789 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.378909 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.403809 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480774 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rndm8\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480813 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480840 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480867 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480896 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480918 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480946 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.480966 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.481000 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.481029 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " 
pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.481058 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.481932 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.481951 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.482622 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.483449 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.484191 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.486690 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.489117 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.489315 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.489391 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.498523 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " 
pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.498789 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rndm8\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.528416 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") " pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.618936 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.629710 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.630943 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640091 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640361 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640515 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640643 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w7g9k" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640693 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640845 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.640857 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.671786 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789548 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789631 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789668 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dr7\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789739 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789796 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789822 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.789855 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.799234 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.799314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.799437 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.799475 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901067 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901404 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901435 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901460 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901478 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901523 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901542 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901564 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8dr7\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901594 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.901643 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.902719 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.902980 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.903189 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.909475 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.910102 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.916719 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.917080 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.925222 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.929272 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.931217 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.934510 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8dr7\" (UniqueName: 
\"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.942865 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:49 crc kubenswrapper[4936]: I0930 13:54:49.975704 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:54:50 crc kubenswrapper[4936]: I0930 13:54:50.167315 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" event={"ID":"da6fa256-04f4-4d38-a47b-d006a419ca5a","Type":"ContainerStarted","Data":"783c4de4277dadb2faf2586499bbb3b3e4a1bd03066aaae1ac68fa985aa8ce38"} Sep 30 13:54:50 crc kubenswrapper[4936]: I0930 13:54:50.168347 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" event={"ID":"bfacaf90-7d52-4568-a210-9c2016c7f5cb","Type":"ContainerStarted","Data":"c62ea0b3936d662cfad5629c550b9e53886bb357f28aec09556704398de6fe0a"} Sep 30 13:54:50 crc kubenswrapper[4936]: I0930 13:54:50.204344 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 13:54:50 crc kubenswrapper[4936]: I0930 13:54:50.578024 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 13:54:50 crc kubenswrapper[4936]: W0930 13:54:50.653890 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22002396_4cfa_4e41_95c0_61672072faa0.slice/crio-14dd3d3e163f55173611e09f1605703763195171e7e3d2c42c90cfb6cbf7c5dc 
WatchSource:0}: Error finding container 14dd3d3e163f55173611e09f1605703763195171e7e3d2c42c90cfb6cbf7c5dc: Status 404 returned error can't find the container with id 14dd3d3e163f55173611e09f1605703763195171e7e3d2c42c90cfb6cbf7c5dc Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.156119 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.159050 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.167584 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.167963 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.168189 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.168419 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qhqxb" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.168934 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.184216 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.189139 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerStarted","Data":"dbc4eacb64c32abb45fe3aa87db4b771d654ca2ccc207750e6c69e96b637a188"} Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.194848 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerStarted","Data":"14dd3d3e163f55173611e09f1605703763195171e7e3d2c42c90cfb6cbf7c5dc"} Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.209716 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.260624 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.260728 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.260895 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.260939 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-kolla-config\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.261579 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-default\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.261624 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.261643 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-secrets\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.261691 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw2x7\" (UniqueName: \"kubernetes.io/projected/4205821a-580b-4f4c-9e89-9fa6aae93378-kube-api-access-qw2x7\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.261735 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363010 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363093 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363175 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363197 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-kolla-config\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363274 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-default\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363292 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: 
\"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363324 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-secrets\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363366 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw2x7\" (UniqueName: \"kubernetes.io/projected/4205821a-580b-4f4c-9e89-9fa6aae93378-kube-api-access-qw2x7\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.363386 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.364105 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.365106 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-default\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 
13:54:51.365785 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4205821a-580b-4f4c-9e89-9fa6aae93378-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.366586 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.367439 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4205821a-580b-4f4c-9e89-9fa6aae93378-kolla-config\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.392142 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw2x7\" (UniqueName: \"kubernetes.io/projected/4205821a-580b-4f4c-9e89-9fa6aae93378-kube-api-access-qw2x7\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.424917 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-secrets\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.425398 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.427489 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205821a-580b-4f4c-9e89-9fa6aae93378-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.443138 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"4205821a-580b-4f4c-9e89-9fa6aae93378\") " pod="openstack/openstack-galera-0" Sep 30 13:54:51 crc kubenswrapper[4936]: I0930 13:54:51.483877 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.160041 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.356122 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.358895 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.359003 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.367071 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.367474 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-svrfq" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.368485 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.369058 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.522994 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2544e332-54a0-46cc-8077-417e83eed982-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523388 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523410 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc 
kubenswrapper[4936]: I0930 13:54:52.523436 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523465 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523485 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523510 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.523548 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc 
kubenswrapper[4936]: I0930 13:54:52.523582 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2whl\" (UniqueName: \"kubernetes.io/projected/2544e332-54a0-46cc-8077-417e83eed982-kube-api-access-p2whl\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625679 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2whl\" (UniqueName: \"kubernetes.io/projected/2544e332-54a0-46cc-8077-417e83eed982-kube-api-access-p2whl\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625752 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2544e332-54a0-46cc-8077-417e83eed982-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625780 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625803 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 
13:54:52.625825 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625847 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625867 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625889 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.625925 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.627993 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.628434 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2544e332-54a0-46cc-8077-417e83eed982-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.629422 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.633229 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.634161 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2544e332-54a0-46cc-8077-417e83eed982-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.634282 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.654170 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.663162 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/2544e332-54a0-46cc-8077-417e83eed982-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.675154 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2whl\" (UniqueName: \"kubernetes.io/projected/2544e332-54a0-46cc-8077-417e83eed982-kube-api-access-p2whl\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.676665 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.678246 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.691699 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.692606 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-46whw" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.694187 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.767532 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2544e332-54a0-46cc-8077-417e83eed982\") " pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.773816 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.829660 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.829729 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.829786 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-config-data\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.829868 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kolla-config\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.829942 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdqv\" (UniqueName: \"kubernetes.io/projected/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kube-api-access-7rdqv\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.931305 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdqv\" (UniqueName: \"kubernetes.io/projected/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kube-api-access-7rdqv\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.931756 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.931803 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.932394 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-config-data\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.932464 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kolla-config\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.933781 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-config-data\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.938684 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.949181 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3631d52-a6a9-46fc-b109-a8e0b96bac93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.949677 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kolla-config\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:52 crc kubenswrapper[4936]: I0930 13:54:52.959447 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdqv\" (UniqueName: \"kubernetes.io/projected/e3631d52-a6a9-46fc-b109-a8e0b96bac93-kube-api-access-7rdqv\") pod \"memcached-0\" (UID: \"e3631d52-a6a9-46fc-b109-a8e0b96bac93\") " pod="openstack/memcached-0" Sep 30 13:54:53 crc kubenswrapper[4936]: I0930 13:54:53.069610 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 13:54:53 crc kubenswrapper[4936]: I0930 13:54:53.123644 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 13:54:53 crc kubenswrapper[4936]: I0930 13:54:53.239739 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4205821a-580b-4f4c-9e89-9fa6aae93378","Type":"ContainerStarted","Data":"a9e1622757b3834a730a0ec03bb8aa89105c68e7d5c7e7072897e7c45fb1b01d"} Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:53.999730 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.124632 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 13:54:54 crc kubenswrapper[4936]: W0930 13:54:54.130296 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3631d52_a6a9_46fc_b109_a8e0b96bac93.slice/crio-9a86b27bb9c7d90d11b222e8053770bd24156a4e9120a05be5e730ad9a98ca1e WatchSource:0}: Error finding container 9a86b27bb9c7d90d11b222e8053770bd24156a4e9120a05be5e730ad9a98ca1e: Status 404 returned error can't find the container with id 
9a86b27bb9c7d90d11b222e8053770bd24156a4e9120a05be5e730ad9a98ca1e Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.298653 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e3631d52-a6a9-46fc-b109-a8e0b96bac93","Type":"ContainerStarted","Data":"9a86b27bb9c7d90d11b222e8053770bd24156a4e9120a05be5e730ad9a98ca1e"} Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.302440 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2544e332-54a0-46cc-8077-417e83eed982","Type":"ContainerStarted","Data":"5366c0fdb6bd8417e29d428f64287093852a66ccd97e878f8ffc081739521a83"} Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.578515 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.584891 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.590346 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.594780 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2lj8b" Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.677924 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82z9r\" (UniqueName: \"kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r\") pod \"kube-state-metrics-0\" (UID: \"34168aff-c364-4158-a3e2-ff82841c060c\") " pod="openstack/kube-state-metrics-0" Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.780140 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82z9r\" (UniqueName: 
\"kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r\") pod \"kube-state-metrics-0\" (UID: \"34168aff-c364-4158-a3e2-ff82841c060c\") " pod="openstack/kube-state-metrics-0" Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.806509 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82z9r\" (UniqueName: \"kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r\") pod \"kube-state-metrics-0\" (UID: \"34168aff-c364-4158-a3e2-ff82841c060c\") " pod="openstack/kube-state-metrics-0" Sep 30 13:54:54 crc kubenswrapper[4936]: I0930 13:54:54.924715 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:54:55 crc kubenswrapper[4936]: I0930 13:54:55.553289 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:54:56 crc kubenswrapper[4936]: I0930 13:54:56.328171 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34168aff-c364-4158-a3e2-ff82841c060c","Type":"ContainerStarted","Data":"60b4acd06d609350f6d20a8841904a264111ed85fb1a33dda3a1452024c7a388"} Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.824697 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.826290 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.830044 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.830129 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.830433 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.830491 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2gj25" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.830945 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.861453 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986599 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986663 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986701 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986796 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986816 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986834 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.986862 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxj2\" (UniqueName: \"kubernetes.io/projected/c2a97be5-d38b-4352-883e-1efaf06ce24e-kube-api-access-gnxj2\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:58 crc kubenswrapper[4936]: I0930 13:54:58.987092 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.063105 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-747gv"] Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.065568 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.069770 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tcc5w" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.070080 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.071552 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k5lgl"] Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.073067 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.080708 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099366 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099463 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099526 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099588 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099622 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099646 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099708 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxj2\" (UniqueName: \"kubernetes.io/projected/c2a97be5-d38b-4352-883e-1efaf06ce24e-kube-api-access-gnxj2\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.099775 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.100156 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.101624 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.102424 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-config\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.102528 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-747gv"] Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.105876 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2a97be5-d38b-4352-883e-1efaf06ce24e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.124898 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.126095 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.132194 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxj2\" (UniqueName: \"kubernetes.io/projected/c2a97be5-d38b-4352-883e-1efaf06ce24e-kube-api-access-gnxj2\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.148205 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a97be5-d38b-4352-883e-1efaf06ce24e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.153587 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl"] Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.188899 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c2a97be5-d38b-4352-883e-1efaf06ce24e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201291 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run-ovn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201671 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-combined-ca-bundle\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201719 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-log\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201746 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491cf6ce-e945-4bd0-b811-b24eed9fcc12-scripts\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201775 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-run\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201802 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxpbf\" (UniqueName: \"kubernetes.io/projected/2654d46a-f44e-45b2-862d-55e5eda229b7-kube-api-access-dxpbf\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201824 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2654d46a-f44e-45b2-862d-55e5eda229b7-scripts\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201849 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201870 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t65tn\" (UniqueName: \"kubernetes.io/projected/491cf6ce-e945-4bd0-b811-b24eed9fcc12-kube-api-access-t65tn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201893 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-etc-ovs\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201933 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-log-ovn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201957 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-ovn-controller-tls-certs\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.201973 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-lib\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303781 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run-ovn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303835 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-combined-ca-bundle\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303861 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-log\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303883 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491cf6ce-e945-4bd0-b811-b24eed9fcc12-scripts\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303898 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-run\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303923 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxpbf\" (UniqueName: 
\"kubernetes.io/projected/2654d46a-f44e-45b2-862d-55e5eda229b7-kube-api-access-dxpbf\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303948 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2654d46a-f44e-45b2-862d-55e5eda229b7-scripts\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303968 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303982 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t65tn\" (UniqueName: \"kubernetes.io/projected/491cf6ce-e945-4bd0-b811-b24eed9fcc12-kube-api-access-t65tn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.303997 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-etc-ovs\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.304030 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-log-ovn\") pod \"ovn-controller-k5lgl\" (UID: 
\"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.304048 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-ovn-controller-tls-certs\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.304064 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-lib\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.304651 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-lib\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.305055 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run-ovn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.308530 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-combined-ca-bundle\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.308566 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-run\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.308739 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-log\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.309685 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2654d46a-f44e-45b2-862d-55e5eda229b7-scripts\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.309853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-var-run\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.310042 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/491cf6ce-e945-4bd0-b811-b24eed9fcc12-var-log-ovn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.310383 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2654d46a-f44e-45b2-862d-55e5eda229b7-etc-ovs\") pod \"ovn-controller-ovs-747gv\" (UID: 
\"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.311877 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/491cf6ce-e945-4bd0-b811-b24eed9fcc12-scripts\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.315617 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/491cf6ce-e945-4bd0-b811-b24eed9fcc12-ovn-controller-tls-certs\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.325622 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxpbf\" (UniqueName: \"kubernetes.io/projected/2654d46a-f44e-45b2-862d-55e5eda229b7-kube-api-access-dxpbf\") pod \"ovn-controller-ovs-747gv\" (UID: \"2654d46a-f44e-45b2-862d-55e5eda229b7\") " pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.326950 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t65tn\" (UniqueName: \"kubernetes.io/projected/491cf6ce-e945-4bd0-b811-b24eed9fcc12-kube-api-access-t65tn\") pod \"ovn-controller-k5lgl\" (UID: \"491cf6ce-e945-4bd0-b811-b24eed9fcc12\") " pod="openstack/ovn-controller-k5lgl" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.389044 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.465868 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 13:54:59 crc kubenswrapper[4936]: I0930 13:54:59.535067 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k5lgl" Sep 30 13:55:00 crc kubenswrapper[4936]: I0930 13:55:00.971864 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl"] Sep 30 13:55:00 crc kubenswrapper[4936]: W0930 13:55:00.999176 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491cf6ce_e945_4bd0_b811_b24eed9fcc12.slice/crio-fc439024ac028e5d7939bb7fcc8c035ae02638eaa5e6a89bcca66517888bf6da WatchSource:0}: Error finding container fc439024ac028e5d7939bb7fcc8c035ae02638eaa5e6a89bcca66517888bf6da: Status 404 returned error can't find the container with id fc439024ac028e5d7939bb7fcc8c035ae02638eaa5e6a89bcca66517888bf6da Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.459156 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl" event={"ID":"491cf6ce-e945-4bd0-b811-b24eed9fcc12","Type":"ContainerStarted","Data":"fc439024ac028e5d7939bb7fcc8c035ae02638eaa5e6a89bcca66517888bf6da"} Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.939317 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.941549 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.951750 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6mcfm" Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.951830 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.951880 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.951776 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 13:55:01 crc kubenswrapper[4936]: I0930 13:55:01.958506 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.059507 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.059651 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.059677 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.059749 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d699954-e8fb-482d-83ea-a131998407a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.059825 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.060169 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.060190 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.060206 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvdm5\" (UniqueName: \"kubernetes.io/projected/9d699954-e8fb-482d-83ea-a131998407a1-kube-api-access-wvdm5\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 
crc kubenswrapper[4936]: I0930 13:55:02.161708 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.161764 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.162924 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.161786 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvdm5\" (UniqueName: \"kubernetes.io/projected/9d699954-e8fb-482d-83ea-a131998407a1-kube-api-access-wvdm5\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.163016 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.163047 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.163761 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d699954-e8fb-482d-83ea-a131998407a1-config\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.164286 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.164772 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d699954-e8fb-482d-83ea-a131998407a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.164839 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9d699954-e8fb-482d-83ea-a131998407a1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.164995 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc 
kubenswrapper[4936]: I0930 13:55:02.165184 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.170247 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.171450 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.183934 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d699954-e8fb-482d-83ea-a131998407a1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.209504 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvdm5\" (UniqueName: \"kubernetes.io/projected/9d699954-e8fb-482d-83ea-a131998407a1-kube-api-access-wvdm5\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.215976 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9d699954-e8fb-482d-83ea-a131998407a1\") " pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:02 crc kubenswrapper[4936]: I0930 13:55:02.309432 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:09 crc kubenswrapper[4936]: I0930 13:55:09.251131 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 13:55:09 crc kubenswrapper[4936]: I0930 13:55:09.874447 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-747gv"] Sep 30 13:55:12 crc kubenswrapper[4936]: W0930 13:55:12.493462 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a97be5_d38b_4352_883e_1efaf06ce24e.slice/crio-f52d8db7beadcf0704b69382f962d75a66d0a87303a1ef1dcead65d65e217f9f WatchSource:0}: Error finding container f52d8db7beadcf0704b69382f962d75a66d0a87303a1ef1dcead65d65e217f9f: Status 404 returned error can't find the container with id f52d8db7beadcf0704b69382f962d75a66d0a87303a1ef1dcead65d65e217f9f Sep 30 13:55:12 crc kubenswrapper[4936]: W0930 13:55:12.498834 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2654d46a_f44e_45b2_862d_55e5eda229b7.slice/crio-ebb0a987dfe4ced213eebd5f38e4655b254e94b443e27852e2eb6bddc496329e WatchSource:0}: Error finding container ebb0a987dfe4ced213eebd5f38e4655b254e94b443e27852e2eb6bddc496329e: Status 404 returned error can't find the container with id ebb0a987dfe4ced213eebd5f38e4655b254e94b443e27852e2eb6bddc496329e Sep 30 13:55:12 crc kubenswrapper[4936]: I0930 13:55:12.534585 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"c2a97be5-d38b-4352-883e-1efaf06ce24e","Type":"ContainerStarted","Data":"f52d8db7beadcf0704b69382f962d75a66d0a87303a1ef1dcead65d65e217f9f"} Sep 30 13:55:12 crc kubenswrapper[4936]: I0930 13:55:12.536221 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-747gv" event={"ID":"2654d46a-f44e-45b2-862d-55e5eda229b7","Type":"ContainerStarted","Data":"ebb0a987dfe4ced213eebd5f38e4655b254e94b443e27852e2eb6bddc496329e"} Sep 30 13:55:17 crc kubenswrapper[4936]: E0930 13:55:17.705962 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 13:55:17 crc kubenswrapper[4936]: E0930 13:55:17.707095 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9bfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-gw8l5_openstack(ef87a30a-fcc3-4259-b917-a0975285ddce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:17 crc kubenswrapper[4936]: E0930 13:55:17.708738 4936 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" podUID="ef87a30a-fcc3-4259-b917-a0975285ddce" Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.250566 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.250945 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.251001 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.252326 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.252463 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" 
containerID="cri-o://48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4" gracePeriod=600 Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.588132 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4" exitCode=0 Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.588414 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4"} Sep 30 13:55:18 crc kubenswrapper[4936]: I0930 13:55:18.588451 4936 scope.go:117] "RemoveContainer" containerID="7b27a62cf82d437a70e61d77c0bf6775c7b99f0aab2b41f8875371a920ef34f1" Sep 30 13:55:18 crc kubenswrapper[4936]: E0930 13:55:18.617383 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Sep 30 13:55:18 crc kubenswrapper[4936]: E0930 13:55:18.617583 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: 
{{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8dr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(22002396-4cfa-4e41-95c0-61672072faa0): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:18 crc kubenswrapper[4936]: E0930 13:55:18.619051 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="22002396-4cfa-4e41-95c0-61672072faa0" Sep 30 13:55:19 crc kubenswrapper[4936]: E0930 13:55:19.599853 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="22002396-4cfa-4e41-95c0-61672072faa0" Sep 30 13:55:20 crc kubenswrapper[4936]: E0930 13:55:20.562591 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Sep 30 13:55:20 crc kubenswrapper[4936]: E0930 13:55:20.562777 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2whl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(2544e332-54a0-46cc-8077-417e83eed982): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:20 crc kubenswrapper[4936]: E0930 13:55:20.564152 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="2544e332-54a0-46cc-8077-417e83eed982" Sep 30 13:55:20 crc kubenswrapper[4936]: E0930 13:55:20.604281 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="2544e332-54a0-46cc-8077-417e83eed982" Sep 30 13:55:21 crc kubenswrapper[4936]: E0930 13:55:21.944779 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Sep 30 13:55:21 crc kubenswrapper[4936]: E0930 13:55:21.945196 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rndm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup
:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(bf1fd592-e9a1-4f76-af38-961560e7b6f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:21 crc kubenswrapper[4936]: E0930 13:55:21.946478 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.624179 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.667380 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.667645 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nd5h69h58bh648h694h5f4h5f9h676h667h5bfh59dh595h694hdbh686hd9h6h66bh54bh599h97h5cdh78h5b6h598h5ffh9h549h575h669hbch9bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rdqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(e3631d52-a6a9-46fc-b109-a8e0b96bac93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.668876 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="e3631d52-a6a9-46fc-b109-a8e0b96bac93" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.783952 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.784138 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qw2x7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Pr
ivileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(4205821a-580b-4f4c-9e89-9fa6aae93378): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:55:22 crc kubenswrapper[4936]: E0930 13:55:22.785316 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="4205821a-580b-4f4c-9e89-9fa6aae93378" Sep 30 13:55:23 crc kubenswrapper[4936]: E0930 13:55:23.632321 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="4205821a-580b-4f4c-9e89-9fa6aae93378" Sep 30 13:55:23 crc kubenswrapper[4936]: E0930 13:55:23.632635 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="e3631d52-a6a9-46fc-b109-a8e0b96bac93" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.031407 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.077230 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc\") pod \"ef87a30a-fcc3-4259-b917-a0975285ddce\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.077311 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bfr\" (UniqueName: \"kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr\") pod \"ef87a30a-fcc3-4259-b917-a0975285ddce\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.077408 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config\") pod \"ef87a30a-fcc3-4259-b917-a0975285ddce\" (UID: \"ef87a30a-fcc3-4259-b917-a0975285ddce\") " Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.078122 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef87a30a-fcc3-4259-b917-a0975285ddce" (UID: "ef87a30a-fcc3-4259-b917-a0975285ddce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.078998 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config" (OuterVolumeSpecName: "config") pod "ef87a30a-fcc3-4259-b917-a0975285ddce" (UID: "ef87a30a-fcc3-4259-b917-a0975285ddce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.083506 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr" (OuterVolumeSpecName: "kube-api-access-d9bfr") pod "ef87a30a-fcc3-4259-b917-a0975285ddce" (UID: "ef87a30a-fcc3-4259-b917-a0975285ddce"). InnerVolumeSpecName "kube-api-access-d9bfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.179449 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.179577 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef87a30a-fcc3-4259-b917-a0975285ddce-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.179591 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9bfr\" (UniqueName: \"kubernetes.io/projected/ef87a30a-fcc3-4259-b917-a0975285ddce-kube-api-access-d9bfr\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.639142 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" event={"ID":"ef87a30a-fcc3-4259-b917-a0975285ddce","Type":"ContainerDied","Data":"8db6f33db8a41f95549d18d9393a33182ba40d52cf36798c01810b0219b08d0e"} Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.639171 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gw8l5" Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.710322 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:55:24 crc kubenswrapper[4936]: I0930 13:55:24.717830 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gw8l5"] Sep 30 13:55:25 crc kubenswrapper[4936]: I0930 13:55:25.750926 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 13:55:26 crc kubenswrapper[4936]: W0930 13:55:26.301496 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d699954_e8fb_482d_83ea_a131998407a1.slice/crio-31fb0ec1b35bdee815f1a24d73f3b72c8deb52c13be75b7c841e49ac8eab21a2 WatchSource:0}: Error finding container 31fb0ec1b35bdee815f1a24d73f3b72c8deb52c13be75b7c841e49ac8eab21a2: Status 404 returned error can't find the container with id 31fb0ec1b35bdee815f1a24d73f3b72c8deb52c13be75b7c841e49ac8eab21a2 Sep 30 13:55:26 crc kubenswrapper[4936]: I0930 13:55:26.326352 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef87a30a-fcc3-4259-b917-a0975285ddce" path="/var/lib/kubelet/pods/ef87a30a-fcc3-4259-b917-a0975285ddce/volumes" Sep 30 13:55:26 crc kubenswrapper[4936]: I0930 13:55:26.666476 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d699954-e8fb-482d-83ea-a131998407a1","Type":"ContainerStarted","Data":"31fb0ec1b35bdee815f1a24d73f3b72c8deb52c13be75b7c841e49ac8eab21a2"} Sep 30 13:55:27 crc kubenswrapper[4936]: I0930 13:55:27.674707 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95"} Sep 30 13:55:28 crc 
kubenswrapper[4936]: E0930 13:55:28.191212 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 13:55:28 crc kubenswrapper[4936]: E0930 13:55:28.191273 4936 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 13:55:28 crc kubenswrapper[4936]: E0930 13:55:28.191447 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82z9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(34168aff-c364-4158-a3e2-ff82841c060c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 13:55:28 crc kubenswrapper[4936]: E0930 13:55:28.192627 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="34168aff-c364-4158-a3e2-ff82841c060c" Sep 30 13:55:28 crc kubenswrapper[4936]: E0930 13:55:28.692127 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="34168aff-c364-4158-a3e2-ff82841c060c" Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.698617 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="2654d46a-f44e-45b2-862d-55e5eda229b7" containerID="769b57c69598c7b63a7fd3386937d5fd801fd6b973aef600202a5a75bdaa8970" exitCode=0 Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.698787 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-747gv" event={"ID":"2654d46a-f44e-45b2-862d-55e5eda229b7","Type":"ContainerDied","Data":"769b57c69598c7b63a7fd3386937d5fd801fd6b973aef600202a5a75bdaa8970"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.703869 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d699954-e8fb-482d-83ea-a131998407a1","Type":"ContainerStarted","Data":"2805b94ed41cdd7706c752932957a9b4c7323b9ecef92e285e8090a592bc5aeb"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.716724 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl" event={"ID":"491cf6ce-e945-4bd0-b811-b24eed9fcc12","Type":"ContainerStarted","Data":"162b18a4d1f2424e9f448375681abbc3137e215cda14827b369e3a109dd53086"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.717025 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-k5lgl" Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.723016 4936 generic.go:334] "Generic (PLEG): container finished" podID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerID="00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea" exitCode=0 Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.723111 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" event={"ID":"da6fa256-04f4-4d38-a47b-d006a419ca5a","Type":"ContainerDied","Data":"00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.729862 4936 generic.go:334] "Generic (PLEG): container finished" podID="68a0ad3f-c240-417f-ae00-81cc1d640696" 
containerID="9320a238bdb24501af17d8e737626c6fa6450b269d995246104197f6a1c6333b" exitCode=0 Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.729936 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" event={"ID":"68a0ad3f-c240-417f-ae00-81cc1d640696","Type":"ContainerDied","Data":"9320a238bdb24501af17d8e737626c6fa6450b269d995246104197f6a1c6333b"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.734818 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerID="e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2" exitCode=0 Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.734895 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" event={"ID":"bfacaf90-7d52-4568-a210-9c2016c7f5cb","Type":"ContainerDied","Data":"e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.742747 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c2a97be5-d38b-4352-883e-1efaf06ce24e","Type":"ContainerStarted","Data":"9b42ae1ff4cc0937f1eb02d7fd9b48ea0e9978c5e4177af190120e58cf1d91ed"} Sep 30 13:55:29 crc kubenswrapper[4936]: I0930 13:55:29.746141 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-k5lgl" podStartSLOduration=4.8831763729999995 podStartE2EDuration="30.746122006s" podCreationTimestamp="2025-09-30 13:54:59 +0000 UTC" firstStartedPulling="2025-09-30 13:55:01.006410899 +0000 UTC m=+951.390413200" lastFinishedPulling="2025-09-30 13:55:26.869356522 +0000 UTC m=+977.253358833" observedRunningTime="2025-09-30 13:55:29.740981496 +0000 UTC m=+980.124983807" watchObservedRunningTime="2025-09-30 13:55:29.746122006 +0000 UTC m=+980.130124307" Sep 30 13:55:30 crc kubenswrapper[4936]: E0930 13:55:30.075864 4936 log.go:32] "CreateContainer in 
sandbox from runtime service failed" err=< Sep 30 13:55:30 crc kubenswrapper[4936]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/bfacaf90-7d52-4568-a210-9c2016c7f5cb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 13:55:30 crc kubenswrapper[4936]: > podSandboxID="c62ea0b3936d662cfad5629c550b9e53886bb357f28aec09556704398de6fe0a" Sep 30 13:55:30 crc kubenswrapper[4936]: E0930 13:55:30.076393 4936 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 13:55:30 crc kubenswrapper[4936]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-448wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-2rq2m_openstack(bfacaf90-7d52-4568-a210-9c2016c7f5cb): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/bfacaf90-7d52-4568-a210-9c2016c7f5cb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 13:55:30 crc kubenswrapper[4936]: > logger="UnhandledError" Sep 30 13:55:30 crc kubenswrapper[4936]: E0930 13:55:30.077585 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/bfacaf90-7d52-4568-a210-9c2016c7f5cb/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" 
podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.099700 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.194014 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7xz9\" (UniqueName: \"kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9\") pod \"68a0ad3f-c240-417f-ae00-81cc1d640696\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.194328 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config\") pod \"68a0ad3f-c240-417f-ae00-81cc1d640696\" (UID: \"68a0ad3f-c240-417f-ae00-81cc1d640696\") " Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.200619 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9" (OuterVolumeSpecName: "kube-api-access-t7xz9") pod "68a0ad3f-c240-417f-ae00-81cc1d640696" (UID: "68a0ad3f-c240-417f-ae00-81cc1d640696"). InnerVolumeSpecName "kube-api-access-t7xz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.218811 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config" (OuterVolumeSpecName: "config") pod "68a0ad3f-c240-417f-ae00-81cc1d640696" (UID: "68a0ad3f-c240-417f-ae00-81cc1d640696"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.296387 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7xz9\" (UniqueName: \"kubernetes.io/projected/68a0ad3f-c240-417f-ae00-81cc1d640696-kube-api-access-t7xz9\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.296421 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68a0ad3f-c240-417f-ae00-81cc1d640696-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.755667 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-747gv" event={"ID":"2654d46a-f44e-45b2-862d-55e5eda229b7","Type":"ContainerStarted","Data":"41809038ab9908941631de9adcbc12f4ae485426fe00701492e36e4d60d031b0"} Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.756019 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-747gv" event={"ID":"2654d46a-f44e-45b2-862d-55e5eda229b7","Type":"ContainerStarted","Data":"c8cdc2c0e8843017d8bb8fa9666bab54965485d20eb12ba1be2442f863cf7769"} Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.756159 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.756194 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.759823 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" event={"ID":"da6fa256-04f4-4d38-a47b-d006a419ca5a","Type":"ContainerStarted","Data":"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934"} Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.760743 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.763909 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.764527 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pngvg" event={"ID":"68a0ad3f-c240-417f-ae00-81cc1d640696","Type":"ContainerDied","Data":"22ca357c4cee0a1e0c8d77a8e9de1f95c94d4f2377bde24c58965e61111866f4"} Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.765468 4936 scope.go:117] "RemoveContainer" containerID="9320a238bdb24501af17d8e737626c6fa6450b269d995246104197f6a1c6333b" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.783877 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-747gv" podStartSLOduration=16.954338935 podStartE2EDuration="31.783854372s" podCreationTimestamp="2025-09-30 13:54:59 +0000 UTC" firstStartedPulling="2025-09-30 13:55:12.500647101 +0000 UTC m=+962.884649402" lastFinishedPulling="2025-09-30 13:55:27.330162538 +0000 UTC m=+977.714164839" observedRunningTime="2025-09-30 13:55:30.777277392 +0000 UTC m=+981.161279713" watchObservedRunningTime="2025-09-30 13:55:30.783854372 +0000 UTC m=+981.167856683" Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.878018 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.887389 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pngvg"] Sep 30 13:55:30 crc kubenswrapper[4936]: I0930 13:55:30.895906 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" podStartSLOduration=7.008012134 podStartE2EDuration="42.895883954s" podCreationTimestamp="2025-09-30 13:54:48 +0000 UTC" 
firstStartedPulling="2025-09-30 13:54:49.392260202 +0000 UTC m=+939.776262493" lastFinishedPulling="2025-09-30 13:55:25.280132012 +0000 UTC m=+975.664134313" observedRunningTime="2025-09-30 13:55:30.86683358 +0000 UTC m=+981.250835881" watchObservedRunningTime="2025-09-30 13:55:30.895883954 +0000 UTC m=+981.279886265" Sep 30 13:55:31 crc kubenswrapper[4936]: I0930 13:55:31.774637 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" event={"ID":"bfacaf90-7d52-4568-a210-9c2016c7f5cb","Type":"ContainerStarted","Data":"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80"} Sep 30 13:55:31 crc kubenswrapper[4936]: I0930 13:55:31.775416 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:55:31 crc kubenswrapper[4936]: I0930 13:55:31.799893 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" podStartSLOduration=6.048520086 podStartE2EDuration="43.799872214s" podCreationTimestamp="2025-09-30 13:54:48 +0000 UTC" firstStartedPulling="2025-09-30 13:54:49.117226143 +0000 UTC m=+939.501228444" lastFinishedPulling="2025-09-30 13:55:26.868578271 +0000 UTC m=+977.252580572" observedRunningTime="2025-09-30 13:55:31.790785525 +0000 UTC m=+982.174787846" watchObservedRunningTime="2025-09-30 13:55:31.799872214 +0000 UTC m=+982.183874525" Sep 30 13:55:32 crc kubenswrapper[4936]: I0930 13:55:32.334790 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68a0ad3f-c240-417f-ae00-81cc1d640696" path="/var/lib/kubelet/pods/68a0ad3f-c240-417f-ae00-81cc1d640696/volumes" Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.400495 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.847002 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"c2a97be5-d38b-4352-883e-1efaf06ce24e","Type":"ContainerStarted","Data":"81bd9ffc724556e0332b379a7037defb3fbda928cc3f9dbc0ba4b26edb279abb"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.848767 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2544e332-54a0-46cc-8077-417e83eed982","Type":"ContainerStarted","Data":"36ffa722ce3a9c2c1ef64116a8b06ce447122a448d36e1922f563ef7c3e09e73"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.852364 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9d699954-e8fb-482d-83ea-a131998407a1","Type":"ContainerStarted","Data":"be94d6d4eb825e821f1b228971fe32edda3cf3d657544664f002fb83c65f9a5f"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.854956 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerStarted","Data":"7d23d36032bcd16c3f026a158ff8a7636fbe1c97e9216ccaf29dded344afc381"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.857209 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4205821a-580b-4f4c-9e89-9fa6aae93378","Type":"ContainerStarted","Data":"2eed6569f3c2a30d3dbb0f1f6752f391ab51eec1f293d4c8cc892838895f39f6"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.859702 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerStarted","Data":"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.861524 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"e3631d52-a6a9-46fc-b109-a8e0b96bac93","Type":"ContainerStarted","Data":"c6afdbe63425bf55294ecb3445200cbcaeb53fedfb9abf5cd618d4c09abbbf9b"} Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.861712 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.873990 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.703545172 podStartE2EDuration="41.873969018s" podCreationTimestamp="2025-09-30 13:54:57 +0000 UTC" firstStartedPulling="2025-09-30 13:55:12.500343003 +0000 UTC m=+962.884345304" lastFinishedPulling="2025-09-30 13:55:37.670766849 +0000 UTC m=+988.054769150" observedRunningTime="2025-09-30 13:55:38.871942633 +0000 UTC m=+989.255944954" watchObservedRunningTime="2025-09-30 13:55:38.873969018 +0000 UTC m=+989.257971319" Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.874538 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:55:38 crc kubenswrapper[4936]: I0930 13:55:38.918850 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=27.426605594 podStartE2EDuration="38.918826434s" podCreationTimestamp="2025-09-30 13:55:00 +0000 UTC" firstStartedPulling="2025-09-30 13:55:26.304011339 +0000 UTC m=+976.688013640" lastFinishedPulling="2025-09-30 13:55:37.796232179 +0000 UTC m=+988.180234480" observedRunningTime="2025-09-30 13:55:38.911740031 +0000 UTC m=+989.295742342" watchObservedRunningTime="2025-09-30 13:55:38.918826434 +0000 UTC m=+989.302828735" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.026426 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.281558656 podStartE2EDuration="47.026407225s" podCreationTimestamp="2025-09-30 13:54:52 +0000 
UTC" firstStartedPulling="2025-09-30 13:54:54.150669434 +0000 UTC m=+944.534671735" lastFinishedPulling="2025-09-30 13:55:37.895517983 +0000 UTC m=+988.279520304" observedRunningTime="2025-09-30 13:55:39.019496776 +0000 UTC m=+989.403499077" watchObservedRunningTime="2025-09-30 13:55:39.026407225 +0000 UTC m=+989.410409526" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.076960 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.077462 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="dnsmasq-dns" containerID="cri-o://c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80" gracePeriod=10 Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.468721 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.565856 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.666257 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc\") pod \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.667619 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config\") pod \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.667682 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448wr\" (UniqueName: \"kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr\") pod \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\" (UID: \"bfacaf90-7d52-4568-a210-9c2016c7f5cb\") " Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.677101 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr" (OuterVolumeSpecName: "kube-api-access-448wr") pod "bfacaf90-7d52-4568-a210-9c2016c7f5cb" (UID: "bfacaf90-7d52-4568-a210-9c2016c7f5cb"). InnerVolumeSpecName "kube-api-access-448wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.712796 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfacaf90-7d52-4568-a210-9c2016c7f5cb" (UID: "bfacaf90-7d52-4568-a210-9c2016c7f5cb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.713087 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config" (OuterVolumeSpecName: "config") pod "bfacaf90-7d52-4568-a210-9c2016c7f5cb" (UID: "bfacaf90-7d52-4568-a210-9c2016c7f5cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.770256 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.770617 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfacaf90-7d52-4568-a210-9c2016c7f5cb-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.770773 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448wr\" (UniqueName: \"kubernetes.io/projected/bfacaf90-7d52-4568-a210-9c2016c7f5cb-kube-api-access-448wr\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.869829 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerID="c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80" exitCode=0 Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.869951 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" event={"ID":"bfacaf90-7d52-4568-a210-9c2016c7f5cb","Type":"ContainerDied","Data":"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80"} Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.869982 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" 
event={"ID":"bfacaf90-7d52-4568-a210-9c2016c7f5cb","Type":"ContainerDied","Data":"c62ea0b3936d662cfad5629c550b9e53886bb357f28aec09556704398de6fe0a"} Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.869997 4936 scope.go:117] "RemoveContainer" containerID="c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.870180 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2rq2m" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.901654 4936 scope.go:117] "RemoveContainer" containerID="e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.903909 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.912192 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2rq2m"] Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.924620 4936 scope.go:117] "RemoveContainer" containerID="c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80" Sep 30 13:55:39 crc kubenswrapper[4936]: E0930 13:55:39.925020 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80\": container with ID starting with c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80 not found: ID does not exist" containerID="c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.925057 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80"} err="failed to get container status 
\"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80\": rpc error: code = NotFound desc = could not find container \"c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80\": container with ID starting with c6430e487cd80fa21063cead4d8261d5110477d9c6472dce90dd0b3753de6b80 not found: ID does not exist" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.925100 4936 scope.go:117] "RemoveContainer" containerID="e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2" Sep 30 13:55:39 crc kubenswrapper[4936]: E0930 13:55:39.925437 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2\": container with ID starting with e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2 not found: ID does not exist" containerID="e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2" Sep 30 13:55:39 crc kubenswrapper[4936]: I0930 13:55:39.925467 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2"} err="failed to get container status \"e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2\": rpc error: code = NotFound desc = could not find container \"e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2\": container with ID starting with e7de7a99416391c1e5044c749a090e491a7e9bc65edef48ce8198e01bce2f5f2 not found: ID does not exist" Sep 30 13:55:40 crc kubenswrapper[4936]: I0930 13:55:40.335622 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" path="/var/lib/kubelet/pods/bfacaf90-7d52-4568-a210-9c2016c7f5cb/volumes" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.310454 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 
13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.399492 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.466931 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.506021 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.883995 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.917740 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 13:55:41 crc kubenswrapper[4936]: I0930 13:55:41.917792 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.196161 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:42 crc kubenswrapper[4936]: E0930 13:55:42.196700 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a0ad3f-c240-417f-ae00-81cc1d640696" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.196723 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a0ad3f-c240-417f-ae00-81cc1d640696" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4936]: E0930 13:55:42.196756 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.196765 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4936]: E0930 13:55:42.196806 4936 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.196815 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.197030 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfacaf90-7d52-4568-a210-9c2016c7f5cb" containerName="dnsmasq-dns" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.197057 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a0ad3f-c240-417f-ae00-81cc1d640696" containerName="init" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.198406 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.200977 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.214801 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.310871 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdzx\" (UniqueName: \"kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.311110 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: 
\"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.311152 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.311224 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.344794 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-krf5p"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.351132 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.353590 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.379849 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-krf5p"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.412757 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.412828 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvdzx\" (UniqueName: \"kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.412860 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovn-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.412941 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " 
pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.412996 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9443e0ef-1389-4d6a-a44d-f9863071e734-config\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.413018 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrf6\" (UniqueName: \"kubernetes.io/projected/9443e0ef-1389-4d6a-a44d-f9863071e734-kube-api-access-kdrf6\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.413065 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovs-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.413098 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.413127 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-combined-ca-bundle\") pod \"ovn-controller-metrics-krf5p\" (UID: 
\"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.413143 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.414090 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.414579 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.414758 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.458999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvdzx\" (UniqueName: \"kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx\") pod \"dnsmasq-dns-7fd796d7df-jcv7b\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc 
kubenswrapper[4936]: I0930 13:55:42.513194 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.513875 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514286 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-combined-ca-bundle\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514470 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovn-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514533 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514574 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9443e0ef-1389-4d6a-a44d-f9863071e734-config\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514596 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kdrf6\" (UniqueName: \"kubernetes.io/projected/9443e0ef-1389-4d6a-a44d-f9863071e734-kube-api-access-kdrf6\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514628 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovs-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514689 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovn-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.514746 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9443e0ef-1389-4d6a-a44d-f9863071e734-ovs-rundir\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.515313 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9443e0ef-1389-4d6a-a44d-f9863071e734-config\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.522269 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-combined-ca-bundle\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.542967 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9443e0ef-1389-4d6a-a44d-f9863071e734-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.579616 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrf6\" (UniqueName: \"kubernetes.io/projected/9443e0ef-1389-4d6a-a44d-f9863071e734-kube-api-access-kdrf6\") pod \"ovn-controller-metrics-krf5p\" (UID: \"9443e0ef-1389-4d6a-a44d-f9863071e734\") " pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.582132 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.597853 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.608321 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.633524 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.636792 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.644153 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.645515 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hswr4" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.646613 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.646911 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.649709 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.660053 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.670860 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-krf5p" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722139 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722186 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722213 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722234 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722288 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qztl7\" (UniqueName: \"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722346 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-scripts\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722439 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfmb\" (UniqueName: \"kubernetes.io/projected/f1be0b97-e1de-4660-bd26-ec0a106cde3d-kube-api-access-xwfmb\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722493 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-config\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722560 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: 
I0930 13:55:42.722633 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.722684 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824309 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824691 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824728 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824760 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824782 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824800 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824822 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qztl7\" (UniqueName: \"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824848 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824866 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-scripts\") pod 
\"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824880 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfmb\" (UniqueName: \"kubernetes.io/projected/f1be0b97-e1de-4660-bd26-ec0a106cde3d-kube-api-access-xwfmb\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824899 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-config\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.824929 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.825800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.826735 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.827433 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.830024 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.830815 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-config\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.831331 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.831977 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.832464 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.840455 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1be0b97-e1de-4660-bd26-ec0a106cde3d-scripts\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.843796 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1be0b97-e1de-4660-bd26-ec0a106cde3d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.849552 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfmb\" (UniqueName: \"kubernetes.io/projected/f1be0b97-e1de-4660-bd26-ec0a106cde3d-kube-api-access-xwfmb\") pod \"ovn-northd-0\" (UID: \"f1be0b97-e1de-4660-bd26-ec0a106cde3d\") " pod="openstack/ovn-northd-0" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.872863 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qztl7\" (UniqueName: \"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7\") pod \"dnsmasq-dns-86db49b7ff-qxw5c\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.931951 4936 generic.go:334] "Generic (PLEG): container finished" podID="2544e332-54a0-46cc-8077-417e83eed982" containerID="36ffa722ce3a9c2c1ef64116a8b06ce447122a448d36e1922f563ef7c3e09e73" exitCode=0 Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.932017 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"2544e332-54a0-46cc-8077-417e83eed982","Type":"ContainerDied","Data":"36ffa722ce3a9c2c1ef64116a8b06ce447122a448d36e1922f563ef7c3e09e73"} Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.940244 4936 generic.go:334] "Generic (PLEG): container finished" podID="4205821a-580b-4f4c-9e89-9fa6aae93378" containerID="2eed6569f3c2a30d3dbb0f1f6752f391ab51eec1f293d4c8cc892838895f39f6" exitCode=0 Sep 30 13:55:42 crc kubenswrapper[4936]: I0930 13:55:42.940790 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4205821a-580b-4f4c-9e89-9fa6aae93378","Type":"ContainerDied","Data":"2eed6569f3c2a30d3dbb0f1f6752f391ab51eec1f293d4c8cc892838895f39f6"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.001826 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.034219 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.137575 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.289377 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.366236 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-krf5p"] Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.680880 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.711317 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"] Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.952188 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1be0b97-e1de-4660-bd26-ec0a106cde3d","Type":"ContainerStarted","Data":"1e4ce79a2b894d9d7cb956e3ef8fbff1c903f97f5e75ef80098396c81a921f57"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.956871 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4205821a-580b-4f4c-9e89-9fa6aae93378","Type":"ContainerStarted","Data":"c4722a6c5ee7ea32d59168b8145fcb8c4cbee43cc4882c58bbf4004d47b1e5da"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.959223 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" event={"ID":"fa16be59-a2ae-4e02-9d85-53793977f45e","Type":"ContainerStarted","Data":"7bee62ea49b01c379f27c2af3a0121f8712b5d0f3e59cf7355922c2e623110c7"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.961184 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" 
event={"ID":"183c122f-9990-4f38-b78f-6b70607064d6","Type":"ContainerStarted","Data":"cb0d7ac0f9515c593bf754d094800edf7ae1727321d17a125c1dccbe473b629c"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.966229 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2544e332-54a0-46cc-8077-417e83eed982","Type":"ContainerStarted","Data":"bc6411c9fd36a9b7a4f22176f66b60763a645a4bd86f8d48dbf5da7ddba1d554"} Sep 30 13:55:43 crc kubenswrapper[4936]: I0930 13:55:43.969745 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-krf5p" event={"ID":"9443e0ef-1389-4d6a-a44d-f9863071e734","Type":"ContainerStarted","Data":"71f3b76cca3ebc6382013b4c310d96486be969bce82e2609c457a16d6ecc4c03"} Sep 30 13:55:46 crc kubenswrapper[4936]: I0930 13:55:46.013763 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.143668302 podStartE2EDuration="56.013737588s" podCreationTimestamp="2025-09-30 13:54:50 +0000 UTC" firstStartedPulling="2025-09-30 13:54:52.22969567 +0000 UTC m=+942.613697961" lastFinishedPulling="2025-09-30 13:55:38.099764946 +0000 UTC m=+988.483767247" observedRunningTime="2025-09-30 13:55:46.012135734 +0000 UTC m=+996.396138045" watchObservedRunningTime="2025-09-30 13:55:46.013737588 +0000 UTC m=+996.397739889" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.005940 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-krf5p" event={"ID":"9443e0ef-1389-4d6a-a44d-f9863071e734","Type":"ContainerStarted","Data":"c95bc3c62c72066648657a31997aab54b937229260b5de31ea9bdeabf29c67a3"} Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.008765 4936 generic.go:334] "Generic (PLEG): container finished" podID="fa16be59-a2ae-4e02-9d85-53793977f45e" containerID="a02f7bd834f7acefec9e87161590d9087b357a6a855eb65a13ed4b1a038c2608" exitCode=0 Sep 30 13:55:48 crc 
kubenswrapper[4936]: I0930 13:55:48.008861 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" event={"ID":"fa16be59-a2ae-4e02-9d85-53793977f45e","Type":"ContainerDied","Data":"a02f7bd834f7acefec9e87161590d9087b357a6a855eb65a13ed4b1a038c2608"} Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.011789 4936 generic.go:334] "Generic (PLEG): container finished" podID="183c122f-9990-4f38-b78f-6b70607064d6" containerID="d0e9782fb59dbc1181eed5b75aa9f014202c363ac0131b190763062e3c7a6176" exitCode=0 Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.013007 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" event={"ID":"183c122f-9990-4f38-b78f-6b70607064d6","Type":"ContainerDied","Data":"d0e9782fb59dbc1181eed5b75aa9f014202c363ac0131b190763062e3c7a6176"} Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.103499 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-krf5p" podStartSLOduration=6.103477901 podStartE2EDuration="6.103477901s" podCreationTimestamp="2025-09-30 13:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:48.024735639 +0000 UTC m=+998.408737940" watchObservedRunningTime="2025-09-30 13:55:48.103477901 +0000 UTC m=+998.487480202" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.174153 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=13.162819817 podStartE2EDuration="57.174127132s" podCreationTimestamp="2025-09-30 13:54:51 +0000 UTC" firstStartedPulling="2025-09-30 13:54:54.09063567 +0000 UTC m=+944.474637971" lastFinishedPulling="2025-09-30 13:55:38.101942985 +0000 UTC m=+988.485945286" observedRunningTime="2025-09-30 13:55:48.136129764 +0000 UTC m=+998.520132065" watchObservedRunningTime="2025-09-30 
13:55:48.174127132 +0000 UTC m=+998.558129433" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.555823 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.644981 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb\") pod \"fa16be59-a2ae-4e02-9d85-53793977f45e\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.645042 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvdzx\" (UniqueName: \"kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx\") pod \"fa16be59-a2ae-4e02-9d85-53793977f45e\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.645210 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config\") pod \"fa16be59-a2ae-4e02-9d85-53793977f45e\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.645228 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc\") pod \"fa16be59-a2ae-4e02-9d85-53793977f45e\" (UID: \"fa16be59-a2ae-4e02-9d85-53793977f45e\") " Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.662909 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx" (OuterVolumeSpecName: "kube-api-access-lvdzx") pod "fa16be59-a2ae-4e02-9d85-53793977f45e" (UID: "fa16be59-a2ae-4e02-9d85-53793977f45e"). 
InnerVolumeSpecName "kube-api-access-lvdzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.742351 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa16be59-a2ae-4e02-9d85-53793977f45e" (UID: "fa16be59-a2ae-4e02-9d85-53793977f45e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.750564 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.750600 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvdzx\" (UniqueName: \"kubernetes.io/projected/fa16be59-a2ae-4e02-9d85-53793977f45e-kube-api-access-lvdzx\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.760127 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa16be59-a2ae-4e02-9d85-53793977f45e" (UID: "fa16be59-a2ae-4e02-9d85-53793977f45e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.763388 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config" (OuterVolumeSpecName: "config") pod "fa16be59-a2ae-4e02-9d85-53793977f45e" (UID: "fa16be59-a2ae-4e02-9d85-53793977f45e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.851512 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:48 crc kubenswrapper[4936]: I0930 13:55:48.851548 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa16be59-a2ae-4e02-9d85-53793977f45e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.048102 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" event={"ID":"fa16be59-a2ae-4e02-9d85-53793977f45e","Type":"ContainerDied","Data":"7bee62ea49b01c379f27c2af3a0121f8712b5d0f3e59cf7355922c2e623110c7"} Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.048442 4936 scope.go:117] "RemoveContainer" containerID="a02f7bd834f7acefec9e87161590d9087b357a6a855eb65a13ed4b1a038c2608" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.048553 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-jcv7b" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.063682 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34168aff-c364-4158-a3e2-ff82841c060c","Type":"ContainerStarted","Data":"f3ce4becbff6479591b148a0a18bdcc170625663c71254f0ae5fcfe19cb8c5c1"} Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.064666 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.069605 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" event={"ID":"183c122f-9990-4f38-b78f-6b70607064d6","Type":"ContainerStarted","Data":"4a818229adfc1278fe17e30ef112baa6b8d686bf1210579c5718ad984ff2a863"} Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.101050 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.451792855 podStartE2EDuration="55.101030609s" podCreationTimestamp="2025-09-30 13:54:54 +0000 UTC" firstStartedPulling="2025-09-30 13:54:55.585476405 +0000 UTC m=+945.969478706" lastFinishedPulling="2025-09-30 13:55:48.234714159 +0000 UTC m=+998.618716460" observedRunningTime="2025-09-30 13:55:49.098769467 +0000 UTC m=+999.482771788" watchObservedRunningTime="2025-09-30 13:55:49.101030609 +0000 UTC m=+999.485032910" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.124544 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podStartSLOduration=7.124525301 podStartE2EDuration="7.124525301s" podCreationTimestamp="2025-09-30 13:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:55:49.124009877 +0000 UTC m=+999.508012198" watchObservedRunningTime="2025-09-30 
13:55:49.124525301 +0000 UTC m=+999.508527602" Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.169934 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:49 crc kubenswrapper[4936]: I0930 13:55:49.174425 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-jcv7b"] Sep 30 13:55:50 crc kubenswrapper[4936]: I0930 13:55:50.078937 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1be0b97-e1de-4660-bd26-ec0a106cde3d","Type":"ContainerStarted","Data":"8c702e5babd947075fc8002a1a00db9c3ce282fd18f288ac1da01f2b876ef88e"} Sep 30 13:55:50 crc kubenswrapper[4936]: I0930 13:55:50.079016 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1be0b97-e1de-4660-bd26-ec0a106cde3d","Type":"ContainerStarted","Data":"76537c90f5487452a6a21dbe71f09b646de8edbfc97b8c46fdfb1533c6e3bf01"} Sep 30 13:55:50 crc kubenswrapper[4936]: I0930 13:55:50.080071 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:50 crc kubenswrapper[4936]: I0930 13:55:50.104024 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.742628131 podStartE2EDuration="8.104009054s" podCreationTimestamp="2025-09-30 13:55:42 +0000 UTC" firstStartedPulling="2025-09-30 13:55:43.653696777 +0000 UTC m=+994.037699068" lastFinishedPulling="2025-09-30 13:55:49.01507769 +0000 UTC m=+999.399079991" observedRunningTime="2025-09-30 13:55:50.10093816 +0000 UTC m=+1000.484940461" watchObservedRunningTime="2025-09-30 13:55:50.104009054 +0000 UTC m=+1000.488011355" Sep 30 13:55:50 crc kubenswrapper[4936]: I0930 13:55:50.325499 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa16be59-a2ae-4e02-9d85-53793977f45e" 
path="/var/lib/kubelet/pods/fa16be59-a2ae-4e02-9d85-53793977f45e/volumes" Sep 30 13:55:51 crc kubenswrapper[4936]: I0930 13:55:51.087411 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 13:55:51 crc kubenswrapper[4936]: I0930 13:55:51.484666 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 13:55:51 crc kubenswrapper[4936]: I0930 13:55:51.485224 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.003924 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.059868 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.060399 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="dnsmasq-dns" containerID="cri-o://d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934" gracePeriod=10 Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.078666 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.078711 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.158840 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.257224 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 
13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.570891 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.597110 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.677754 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.737186 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nckjr\" (UniqueName: \"kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr\") pod \"da6fa256-04f4-4d38-a47b-d006a419ca5a\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.737528 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc\") pod \"da6fa256-04f4-4d38-a47b-d006a419ca5a\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.737757 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config\") pod \"da6fa256-04f4-4d38-a47b-d006a419ca5a\" (UID: \"da6fa256-04f4-4d38-a47b-d006a419ca5a\") " Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.766168 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr" (OuterVolumeSpecName: "kube-api-access-nckjr") pod "da6fa256-04f4-4d38-a47b-d006a419ca5a" (UID: "da6fa256-04f4-4d38-a47b-d006a419ca5a"). InnerVolumeSpecName "kube-api-access-nckjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.807626 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config" (OuterVolumeSpecName: "config") pod "da6fa256-04f4-4d38-a47b-d006a419ca5a" (UID: "da6fa256-04f4-4d38-a47b-d006a419ca5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.830458 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da6fa256-04f4-4d38-a47b-d006a419ca5a" (UID: "da6fa256-04f4-4d38-a47b-d006a419ca5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.840382 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nckjr\" (UniqueName: \"kubernetes.io/projected/da6fa256-04f4-4d38-a47b-d006a419ca5a-kube-api-access-nckjr\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.840423 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:53 crc kubenswrapper[4936]: I0930 13:55:53.840445 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da6fa256-04f4-4d38-a47b-d006a419ca5a-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.116664 4936 generic.go:334] "Generic (PLEG): container finished" podID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerID="d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934" exitCode=0 Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 
13:55:54.116747 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.116750 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" event={"ID":"da6fa256-04f4-4d38-a47b-d006a419ca5a","Type":"ContainerDied","Data":"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934"} Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.116795 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rpth5" event={"ID":"da6fa256-04f4-4d38-a47b-d006a419ca5a","Type":"ContainerDied","Data":"783c4de4277dadb2faf2586499bbb3b3e4a1bd03066aaae1ac68fa985aa8ce38"} Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.116814 4936 scope.go:117] "RemoveContainer" containerID="d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.140602 4936 scope.go:117] "RemoveContainer" containerID="00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.148442 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.155872 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rpth5"] Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.162100 4936 scope.go:117] "RemoveContainer" containerID="d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934" Sep 30 13:55:54 crc kubenswrapper[4936]: E0930 13:55:54.162466 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934\": container with ID starting with d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934 not found: 
ID does not exist" containerID="d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.162515 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934"} err="failed to get container status \"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934\": rpc error: code = NotFound desc = could not find container \"d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934\": container with ID starting with d348b0d5a5453d01631c6042c57d72a481678544500e271b7e726ea81252f934 not found: ID does not exist" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.162544 4936 scope.go:117] "RemoveContainer" containerID="00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea" Sep 30 13:55:54 crc kubenswrapper[4936]: E0930 13:55:54.162824 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea\": container with ID starting with 00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea not found: ID does not exist" containerID="00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.162855 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea"} err="failed to get container status \"00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea\": rpc error: code = NotFound desc = could not find container \"00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea\": container with ID starting with 00533756f71b966b19907b8848af29b3126abc1d533e2885b7618e920da716ea not found: ID does not exist" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.327127 4936 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" path="/var/lib/kubelet/pods/da6fa256-04f4-4d38-a47b-d006a419ca5a/volumes" Sep 30 13:55:54 crc kubenswrapper[4936]: I0930 13:55:54.937689 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 13:55:59 crc kubenswrapper[4936]: I0930 13:55:59.570153 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-k5lgl" podUID="491cf6ce-e945-4bd0-b811-b24eed9fcc12" containerName="ovn-controller" probeResult="failure" output=< Sep 30 13:55:59 crc kubenswrapper[4936]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 13:55:59 crc kubenswrapper[4936]: > Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.695268 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bs47q"] Sep 30 13:56:02 crc kubenswrapper[4936]: E0930 13:56:02.696033 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="init" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696044 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="init" Sep 30 13:56:02 crc kubenswrapper[4936]: E0930 13:56:02.696068 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="dnsmasq-dns" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696074 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="dnsmasq-dns" Sep 30 13:56:02 crc kubenswrapper[4936]: E0930 13:56:02.696089 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa16be59-a2ae-4e02-9d85-53793977f45e" containerName="init" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696096 4936 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fa16be59-a2ae-4e02-9d85-53793977f45e" containerName="init" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696232 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa16be59-a2ae-4e02-9d85-53793977f45e" containerName="init" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696247 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6fa256-04f4-4d38-a47b-d006a419ca5a" containerName="dnsmasq-dns" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.696992 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.714566 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bs47q"] Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.814481 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg846\" (UniqueName: \"kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846\") pod \"keystone-db-create-bs47q\" (UID: \"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e\") " pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.902646 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p5xhr"] Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.903578 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.916135 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vrp\" (UniqueName: \"kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp\") pod \"placement-db-create-p5xhr\" (UID: \"15771d44-77f5-4db8-98d0-588ddcef9157\") " pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.916290 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg846\" (UniqueName: \"kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846\") pod \"keystone-db-create-bs47q\" (UID: \"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e\") " pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.917925 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p5xhr"] Sep 30 13:56:02 crc kubenswrapper[4936]: I0930 13:56:02.954298 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg846\" (UniqueName: \"kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846\") pod \"keystone-db-create-bs47q\" (UID: \"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e\") " pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.018441 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vrp\" (UniqueName: \"kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp\") pod \"placement-db-create-p5xhr\" (UID: \"15771d44-77f5-4db8-98d0-588ddcef9157\") " pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.018493 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.040277 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vrp\" (UniqueName: \"kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp\") pod \"placement-db-create-p5xhr\" (UID: \"15771d44-77f5-4db8-98d0-588ddcef9157\") " pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.099086 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.219120 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.357173 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gdllr"] Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.360973 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gdllr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.364699 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gdllr"] Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.545759 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bs47q"] Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.546547 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dscgg\" (UniqueName: \"kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg\") pod \"glance-db-create-gdllr\" (UID: \"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996\") " pod="openstack/glance-db-create-gdllr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.648546 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dscgg\" (UniqueName: \"kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg\") pod \"glance-db-create-gdllr\" (UID: \"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996\") " pod="openstack/glance-db-create-gdllr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.669800 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscgg\" (UniqueName: \"kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg\") pod \"glance-db-create-gdllr\" (UID: \"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996\") " pod="openstack/glance-db-create-gdllr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.685631 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gdllr" Sep 30 13:56:03 crc kubenswrapper[4936]: I0930 13:56:03.757095 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p5xhr"] Sep 30 13:56:03 crc kubenswrapper[4936]: W0930 13:56:03.787561 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15771d44_77f5_4db8_98d0_588ddcef9157.slice/crio-c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752 WatchSource:0}: Error finding container c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752: Status 404 returned error can't find the container with id c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752 Sep 30 13:56:03 crc kubenswrapper[4936]: E0930 13:56:03.923945 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80c8cc7e_8b32_4662_bf25_0f95cd8aa48e.slice/crio-conmon-66dcfefdb2eafb4a182a0538c511380c43fb7a29673f850650cad36d67359aba.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80c8cc7e_8b32_4662_bf25_0f95cd8aa48e.slice/crio-66dcfefdb2eafb4a182a0538c511380c43fb7a29673f850650cad36d67359aba.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.188382 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gdllr"] Sep 30 13:56:04 crc kubenswrapper[4936]: W0930 13:56:04.193222 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98bc6478_04f5_4a0a_a1fc_5f4b7f2ea996.slice/crio-9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815 WatchSource:0}: Error finding container 
9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815: Status 404 returned error can't find the container with id 9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815 Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.210247 4936 generic.go:334] "Generic (PLEG): container finished" podID="15771d44-77f5-4db8-98d0-588ddcef9157" containerID="9acf4412ffb072a291b0b1a040b54e051f3a732523c9ffe7c0409b2e4457514a" exitCode=0 Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.210377 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p5xhr" event={"ID":"15771d44-77f5-4db8-98d0-588ddcef9157","Type":"ContainerDied","Data":"9acf4412ffb072a291b0b1a040b54e051f3a732523c9ffe7c0409b2e4457514a"} Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.210404 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p5xhr" event={"ID":"15771d44-77f5-4db8-98d0-588ddcef9157","Type":"ContainerStarted","Data":"c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752"} Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.213082 4936 generic.go:334] "Generic (PLEG): container finished" podID="80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" containerID="66dcfefdb2eafb4a182a0538c511380c43fb7a29673f850650cad36d67359aba" exitCode=0 Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.213219 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bs47q" event={"ID":"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e","Type":"ContainerDied","Data":"66dcfefdb2eafb4a182a0538c511380c43fb7a29673f850650cad36d67359aba"} Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.213271 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bs47q" event={"ID":"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e","Type":"ContainerStarted","Data":"dc371fd03c668c67965f21f45af712cfec390f74bebecec4890b4b3ce0a3e3a4"} Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 
13:56:04.214449 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gdllr" event={"ID":"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996","Type":"ContainerStarted","Data":"9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815"} Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.431723 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.442177 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-747gv" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.569276 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-k5lgl" podUID="491cf6ce-e945-4bd0-b811-b24eed9fcc12" containerName="ovn-controller" probeResult="failure" output=< Sep 30 13:56:04 crc kubenswrapper[4936]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 13:56:04 crc kubenswrapper[4936]: > Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.647564 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k5lgl-config-8wvkg"] Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.648743 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.650369 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.663759 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl-config-8wvkg"] Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791325 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791504 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791608 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb97\" (UniqueName: \"kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791679 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts\") pod 
\"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791712 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.791758 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.892960 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893025 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893056 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb97\" (UniqueName: 
\"kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893142 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893227 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893367 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893364 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.893364 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.894019 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.895276 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.912047 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb97\" (UniqueName: \"kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97\") pod \"ovn-controller-k5lgl-config-8wvkg\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:04 crc kubenswrapper[4936]: I0930 13:56:04.963364 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.222735 4936 generic.go:334] "Generic (PLEG): container finished" podID="98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" containerID="654b9df2ff2d0010cb049bcf7ee486cecc64a65db3ef9f0c15bedad1a0f8c669" exitCode=0 Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.223637 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gdllr" event={"ID":"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996","Type":"ContainerDied","Data":"654b9df2ff2d0010cb049bcf7ee486cecc64a65db3ef9f0c15bedad1a0f8c669"} Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.401409 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl-config-8wvkg"] Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.558658 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.569374 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.704948 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg846\" (UniqueName: \"kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846\") pod \"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e\" (UID: \"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e\") " Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.705080 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9vrp\" (UniqueName: \"kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp\") pod \"15771d44-77f5-4db8-98d0-588ddcef9157\" (UID: \"15771d44-77f5-4db8-98d0-588ddcef9157\") " Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.710943 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp" (OuterVolumeSpecName: "kube-api-access-d9vrp") pod "15771d44-77f5-4db8-98d0-588ddcef9157" (UID: "15771d44-77f5-4db8-98d0-588ddcef9157"). InnerVolumeSpecName "kube-api-access-d9vrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.710973 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846" (OuterVolumeSpecName: "kube-api-access-lg846") pod "80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" (UID: "80c8cc7e-8b32-4662-bf25-0f95cd8aa48e"). InnerVolumeSpecName "kube-api-access-lg846". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.806572 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9vrp\" (UniqueName: \"kubernetes.io/projected/15771d44-77f5-4db8-98d0-588ddcef9157-kube-api-access-d9vrp\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:05 crc kubenswrapper[4936]: I0930 13:56:05.806605 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg846\" (UniqueName: \"kubernetes.io/projected/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e-kube-api-access-lg846\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.231448 4936 generic.go:334] "Generic (PLEG): container finished" podID="ca166f01-e5c0-467b-9d33-b0511e05fad0" containerID="add9150671b34395e2a0c04f67604c4a7430a4b6a780472138adada4476b15e1" exitCode=0 Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.231497 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-8wvkg" event={"ID":"ca166f01-e5c0-467b-9d33-b0511e05fad0","Type":"ContainerDied","Data":"add9150671b34395e2a0c04f67604c4a7430a4b6a780472138adada4476b15e1"} Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.231772 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-8wvkg" event={"ID":"ca166f01-e5c0-467b-9d33-b0511e05fad0","Type":"ContainerStarted","Data":"05c2a9e625e544c03ebdaa52c8fcb95b5135e80ec0b130ede78cdfc3a8cb391c"} Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.233210 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bs47q" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.233342 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bs47q" event={"ID":"80c8cc7e-8b32-4662-bf25-0f95cd8aa48e","Type":"ContainerDied","Data":"dc371fd03c668c67965f21f45af712cfec390f74bebecec4890b4b3ce0a3e3a4"} Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.233378 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc371fd03c668c67965f21f45af712cfec390f74bebecec4890b4b3ce0a3e3a4" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.234809 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p5xhr" event={"ID":"15771d44-77f5-4db8-98d0-588ddcef9157","Type":"ContainerDied","Data":"c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752"} Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.234843 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c93b57a8d1cbc99718c137da0d6cdb06892da120aaab5283b428d0702ccba752" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.234815 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p5xhr" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.549745 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gdllr" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.719858 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dscgg\" (UniqueName: \"kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg\") pod \"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996\" (UID: \"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996\") " Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.730575 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg" (OuterVolumeSpecName: "kube-api-access-dscgg") pod "98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" (UID: "98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996"). InnerVolumeSpecName "kube-api-access-dscgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:06 crc kubenswrapper[4936]: I0930 13:56:06.821930 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dscgg\" (UniqueName: \"kubernetes.io/projected/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996-kube-api-access-dscgg\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.243535 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gdllr" event={"ID":"98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996","Type":"ContainerDied","Data":"9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815"} Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.243851 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd3189934b270b047114cd8bc05814f04b4d07240cf66e394788205e1c9b815" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.243570 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gdllr" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.591610 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.754674 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755054 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755227 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755414 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755543 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " 
Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755709 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnb97\" (UniqueName: \"kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97\") pod \"ca166f01-e5c0-467b-9d33-b0511e05fad0\" (UID: \"ca166f01-e5c0-467b-9d33-b0511e05fad0\") " Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.754833 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run" (OuterVolumeSpecName: "var-run") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755382 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.755658 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.756019 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.758894 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts" (OuterVolumeSpecName: "scripts") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.764667 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97" (OuterVolumeSpecName: "kube-api-access-mnb97") pod "ca166f01-e5c0-467b-9d33-b0511e05fad0" (UID: "ca166f01-e5c0-467b-9d33-b0511e05fad0"). InnerVolumeSpecName "kube-api-access-mnb97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.858883 4936 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.859950 4936 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.860088 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca166f01-e5c0-467b-9d33-b0511e05fad0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.860147 4936 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.860201 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnb97\" (UniqueName: \"kubernetes.io/projected/ca166f01-e5c0-467b-9d33-b0511e05fad0-kube-api-access-mnb97\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:07 crc kubenswrapper[4936]: I0930 13:56:07.860258 4936 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ca166f01-e5c0-467b-9d33-b0511e05fad0-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.254495 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-8wvkg" event={"ID":"ca166f01-e5c0-467b-9d33-b0511e05fad0","Type":"ContainerDied","Data":"05c2a9e625e544c03ebdaa52c8fcb95b5135e80ec0b130ede78cdfc3a8cb391c"} Sep 30 13:56:08 crc 
kubenswrapper[4936]: I0930 13:56:08.254535 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05c2a9e625e544c03ebdaa52c8fcb95b5135e80ec0b130ede78cdfc3a8cb391c" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.254633 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-8wvkg" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.698521 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-k5lgl-config-8wvkg"] Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.705197 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-k5lgl-config-8wvkg"] Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.820569 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k5lgl-config-wsnjc"] Sep 30 13:56:08 crc kubenswrapper[4936]: E0930 13:56:08.821375 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.821478 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: E0930 13:56:08.821567 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15771d44-77f5-4db8-98d0-588ddcef9157" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.821654 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="15771d44-77f5-4db8-98d0-588ddcef9157" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: E0930 13:56:08.821746 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca166f01-e5c0-467b-9d33-b0511e05fad0" containerName="ovn-config" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.821858 4936 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ca166f01-e5c0-467b-9d33-b0511e05fad0" containerName="ovn-config" Sep 30 13:56:08 crc kubenswrapper[4936]: E0930 13:56:08.821968 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.822054 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.822316 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca166f01-e5c0-467b-9d33-b0511e05fad0" containerName="ovn-config" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.822438 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.822534 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="15771d44-77f5-4db8-98d0-588ddcef9157" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.822627 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" containerName="mariadb-database-create" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.823399 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.828290 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.836722 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl-config-wsnjc"] Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.872794 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctlnb\" (UniqueName: \"kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.873470 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.873609 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.873721 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: 
\"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.873874 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.874029 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974601 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctlnb\" (UniqueName: \"kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974660 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974708 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn\") pod 
\"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974764 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.974846 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.975242 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.975803 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts\") pod 
\"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.975873 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.975922 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.977239 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:08 crc kubenswrapper[4936]: I0930 13:56:08.994077 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctlnb\" (UniqueName: \"kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb\") pod \"ovn-controller-k5lgl-config-wsnjc\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.166447 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.283294 4936 generic.go:334] "Generic (PLEG): container finished" podID="22002396-4cfa-4e41-95c0-61672072faa0" containerID="3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac" exitCode=0 Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.283361 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerDied","Data":"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac"} Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.288901 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerID="7d23d36032bcd16c3f026a158ff8a7636fbe1c97e9216ccaf29dded344afc381" exitCode=0 Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.288937 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerDied","Data":"7d23d36032bcd16c3f026a158ff8a7636fbe1c97e9216ccaf29dded344afc381"} Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.565758 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-k5lgl" Sep 30 13:56:09 crc kubenswrapper[4936]: I0930 13:56:09.734188 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k5lgl-config-wsnjc"] Sep 30 13:56:09 crc kubenswrapper[4936]: W0930 13:56:09.740596 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc88d6c74_28d4_4f2c_9011_082845bcab64.slice/crio-0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76 WatchSource:0}: Error finding container 0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76: Status 404 returned error can't find the 
container with id 0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76 Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.297736 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerStarted","Data":"7fa365f8532602152ab7fa0f35c0ed5f4cb79fa1da22fba6027e4d281675a4f2"} Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.298997 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.301259 4936 generic.go:334] "Generic (PLEG): container finished" podID="c88d6c74-28d4-4f2c-9011-082845bcab64" containerID="fcdb45b1fbb5ca1ed97b759ac903c8b6056e829abc83644d5e679c61a0734034" exitCode=0 Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.301324 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-wsnjc" event={"ID":"c88d6c74-28d4-4f2c-9011-082845bcab64","Type":"ContainerDied","Data":"fcdb45b1fbb5ca1ed97b759ac903c8b6056e829abc83644d5e679c61a0734034"} Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.301360 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-wsnjc" event={"ID":"c88d6c74-28d4-4f2c-9011-082845bcab64","Type":"ContainerStarted","Data":"0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76"} Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.303306 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerStarted","Data":"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0"} Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.304053 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.341738 4936 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.693567813 podStartE2EDuration="1m22.341714546s" podCreationTimestamp="2025-09-30 13:54:48 +0000 UTC" firstStartedPulling="2025-09-30 13:54:50.244929493 +0000 UTC m=+940.628931794" lastFinishedPulling="2025-09-30 13:55:37.893076226 +0000 UTC m=+988.277078527" observedRunningTime="2025-09-30 13:56:10.331595969 +0000 UTC m=+1020.715598270" watchObservedRunningTime="2025-09-30 13:56:10.341714546 +0000 UTC m=+1020.725716847" Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.350479 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca166f01-e5c0-467b-9d33-b0511e05fad0" path="/var/lib/kubelet/pods/ca166f01-e5c0-467b-9d33-b0511e05fad0/volumes" Sep 30 13:56:10 crc kubenswrapper[4936]: I0930 13:56:10.395742 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.123273593 podStartE2EDuration="1m22.395717372s" podCreationTimestamp="2025-09-30 13:54:48 +0000 UTC" firstStartedPulling="2025-09-30 13:54:50.673629536 +0000 UTC m=+941.057631837" lastFinishedPulling="2025-09-30 13:55:37.946073315 +0000 UTC m=+988.330075616" observedRunningTime="2025-09-30 13:56:10.388884195 +0000 UTC m=+1020.772886496" watchObservedRunningTime="2025-09-30 13:56:10.395717372 +0000 UTC m=+1020.779719683" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.631811 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.733955 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734079 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734095 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734113 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734224 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctlnb\" (UniqueName: \"kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734272 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts\") pod \"c88d6c74-28d4-4f2c-9011-082845bcab64\" (UID: \"c88d6c74-28d4-4f2c-9011-082845bcab64\") " Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734571 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run" (OuterVolumeSpecName: "var-run") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734602 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.734854 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.735030 4936 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.735051 4936 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.735062 4936 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.735074 4936 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c88d6c74-28d4-4f2c-9011-082845bcab64-var-run\") on node \"crc\" DevicePath \"\"" 
Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.735321 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts" (OuterVolumeSpecName: "scripts") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.746700 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb" (OuterVolumeSpecName: "kube-api-access-ctlnb") pod "c88d6c74-28d4-4f2c-9011-082845bcab64" (UID: "c88d6c74-28d4-4f2c-9011-082845bcab64"). InnerVolumeSpecName "kube-api-access-ctlnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.836489 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctlnb\" (UniqueName: \"kubernetes.io/projected/c88d6c74-28d4-4f2c-9011-082845bcab64-kube-api-access-ctlnb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:11 crc kubenswrapper[4936]: I0930 13:56:11.836516 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c88d6c74-28d4-4f2c-9011-082845bcab64-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.323812 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k5lgl-config-wsnjc" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.325673 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k5lgl-config-wsnjc" event={"ID":"c88d6c74-28d4-4f2c-9011-082845bcab64","Type":"ContainerDied","Data":"0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76"} Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.325713 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d79f19a23038ac56c1c4e6b9ab2d73b04143d025a6a77456ab3033767bedf76" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.706921 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f65f-account-create-8wxzh"] Sep 30 13:56:12 crc kubenswrapper[4936]: E0930 13:56:12.707869 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88d6c74-28d4-4f2c-9011-082845bcab64" containerName="ovn-config" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.707884 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88d6c74-28d4-4f2c-9011-082845bcab64" containerName="ovn-config" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.708085 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88d6c74-28d4-4f2c-9011-082845bcab64" containerName="ovn-config" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.708602 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.710183 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.718857 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f65f-account-create-8wxzh"] Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.737795 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-k5lgl-config-wsnjc"] Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.754887 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-k5lgl-config-wsnjc"] Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.851294 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6vk\" (UniqueName: \"kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk\") pod \"keystone-f65f-account-create-8wxzh\" (UID: \"40017b63-ca9a-49cf-9e10-40ae1ec179e5\") " pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.953085 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6vk\" (UniqueName: \"kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk\") pod \"keystone-f65f-account-create-8wxzh\" (UID: \"40017b63-ca9a-49cf-9e10-40ae1ec179e5\") " pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:12 crc kubenswrapper[4936]: I0930 13:56:12.971029 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6vk\" (UniqueName: \"kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk\") pod \"keystone-f65f-account-create-8wxzh\" (UID: \"40017b63-ca9a-49cf-9e10-40ae1ec179e5\") " 
pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.023714 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.072972 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3dd9-account-create-5t4zd"] Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.074179 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.078254 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.087046 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3dd9-account-create-5t4zd"] Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.156103 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkm2\" (UniqueName: \"kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2\") pod \"placement-3dd9-account-create-5t4zd\" (UID: \"ccfcd490-2af5-4e6d-a842-337e4dfd7333\") " pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.258710 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkm2\" (UniqueName: \"kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2\") pod \"placement-3dd9-account-create-5t4zd\" (UID: \"ccfcd490-2af5-4e6d-a842-337e4dfd7333\") " pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.276138 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkm2\" (UniqueName: 
\"kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2\") pod \"placement-3dd9-account-create-5t4zd\" (UID: \"ccfcd490-2af5-4e6d-a842-337e4dfd7333\") " pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.370502 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f65f-account-create-8wxzh"] Sep 30 13:56:13 crc kubenswrapper[4936]: W0930 13:56:13.380515 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40017b63_ca9a_49cf_9e10_40ae1ec179e5.slice/crio-62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0 WatchSource:0}: Error finding container 62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0: Status 404 returned error can't find the container with id 62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0 Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.443123 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.490375 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c9ba-account-create-ntmbh"] Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.492487 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.498291 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.517185 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c9ba-account-create-ntmbh"] Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.568194 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzmj\" (UniqueName: \"kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj\") pod \"glance-c9ba-account-create-ntmbh\" (UID: \"670c827e-fdbe-47de-8b0f-be94d7668f2c\") " pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.670145 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzmj\" (UniqueName: \"kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj\") pod \"glance-c9ba-account-create-ntmbh\" (UID: \"670c827e-fdbe-47de-8b0f-be94d7668f2c\") " pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.689999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzmj\" (UniqueName: \"kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj\") pod \"glance-c9ba-account-create-ntmbh\" (UID: \"670c827e-fdbe-47de-8b0f-be94d7668f2c\") " pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.811695 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:13 crc kubenswrapper[4936]: W0930 13:56:13.935437 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccfcd490_2af5_4e6d_a842_337e4dfd7333.slice/crio-1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838 WatchSource:0}: Error finding container 1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838: Status 404 returned error can't find the container with id 1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838 Sep 30 13:56:13 crc kubenswrapper[4936]: I0930 13:56:13.937586 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3dd9-account-create-5t4zd"] Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.264709 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c9ba-account-create-ntmbh"] Sep 30 13:56:14 crc kubenswrapper[4936]: W0930 13:56:14.272627 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670c827e_fdbe_47de_8b0f_be94d7668f2c.slice/crio-0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546 WatchSource:0}: Error finding container 0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546: Status 404 returned error can't find the container with id 0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546 Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.327396 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88d6c74-28d4-4f2c-9011-082845bcab64" path="/var/lib/kubelet/pods/c88d6c74-28d4-4f2c-9011-082845bcab64/volumes" Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.340691 4936 generic.go:334] "Generic (PLEG): container finished" podID="40017b63-ca9a-49cf-9e10-40ae1ec179e5" containerID="f49c9cfa268a348031096503f9bddc3c976e9c1b5c055dac12816fbd94ee9864" 
exitCode=0 Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.340751 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f65f-account-create-8wxzh" event={"ID":"40017b63-ca9a-49cf-9e10-40ae1ec179e5","Type":"ContainerDied","Data":"f49c9cfa268a348031096503f9bddc3c976e9c1b5c055dac12816fbd94ee9864"} Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.340774 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f65f-account-create-8wxzh" event={"ID":"40017b63-ca9a-49cf-9e10-40ae1ec179e5","Type":"ContainerStarted","Data":"62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0"} Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.342262 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ba-account-create-ntmbh" event={"ID":"670c827e-fdbe-47de-8b0f-be94d7668f2c","Type":"ContainerStarted","Data":"0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546"} Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.344888 4936 generic.go:334] "Generic (PLEG): container finished" podID="ccfcd490-2af5-4e6d-a842-337e4dfd7333" containerID="9adb2d3bba33510151914d052fa7503c2eb047b7ab39a9206fd63a662109c13f" exitCode=0 Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.344923 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3dd9-account-create-5t4zd" event={"ID":"ccfcd490-2af5-4e6d-a842-337e4dfd7333","Type":"ContainerDied","Data":"9adb2d3bba33510151914d052fa7503c2eb047b7ab39a9206fd63a662109c13f"} Sep 30 13:56:14 crc kubenswrapper[4936]: I0930 13:56:14.344941 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3dd9-account-create-5t4zd" event={"ID":"ccfcd490-2af5-4e6d-a842-337e4dfd7333","Type":"ContainerStarted","Data":"1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838"} Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.353896 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="670c827e-fdbe-47de-8b0f-be94d7668f2c" containerID="4b837882bbb445147dbafe995ccea4009b68786ba254c56c1951db46f1f0aac9" exitCode=0 Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.354105 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ba-account-create-ntmbh" event={"ID":"670c827e-fdbe-47de-8b0f-be94d7668f2c","Type":"ContainerDied","Data":"4b837882bbb445147dbafe995ccea4009b68786ba254c56c1951db46f1f0aac9"} Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.749845 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.755665 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.921199 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d6vk\" (UniqueName: \"kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk\") pod \"40017b63-ca9a-49cf-9e10-40ae1ec179e5\" (UID: \"40017b63-ca9a-49cf-9e10-40ae1ec179e5\") " Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.921384 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qkm2\" (UniqueName: \"kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2\") pod \"ccfcd490-2af5-4e6d-a842-337e4dfd7333\" (UID: \"ccfcd490-2af5-4e6d-a842-337e4dfd7333\") " Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.933031 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk" (OuterVolumeSpecName: "kube-api-access-6d6vk") pod "40017b63-ca9a-49cf-9e10-40ae1ec179e5" (UID: "40017b63-ca9a-49cf-9e10-40ae1ec179e5"). InnerVolumeSpecName "kube-api-access-6d6vk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:15 crc kubenswrapper[4936]: I0930 13:56:15.942611 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2" (OuterVolumeSpecName: "kube-api-access-8qkm2") pod "ccfcd490-2af5-4e6d-a842-337e4dfd7333" (UID: "ccfcd490-2af5-4e6d-a842-337e4dfd7333"). InnerVolumeSpecName "kube-api-access-8qkm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.023657 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qkm2\" (UniqueName: \"kubernetes.io/projected/ccfcd490-2af5-4e6d-a842-337e4dfd7333-kube-api-access-8qkm2\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.023685 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d6vk\" (UniqueName: \"kubernetes.io/projected/40017b63-ca9a-49cf-9e10-40ae1ec179e5-kube-api-access-6d6vk\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.361071 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f65f-account-create-8wxzh" event={"ID":"40017b63-ca9a-49cf-9e10-40ae1ec179e5","Type":"ContainerDied","Data":"62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0"} Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.361141 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62879f260dc780777a9ce6a6d972de3f441fc833cc0f903b621306d59c31a0d0" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.361093 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f65f-account-create-8wxzh" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.363831 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3dd9-account-create-5t4zd" event={"ID":"ccfcd490-2af5-4e6d-a842-337e4dfd7333","Type":"ContainerDied","Data":"1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838"} Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.363865 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1536f96ac3bb27d7577a629b92752df25c2fe1a80f812ef8735a68a214813838" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.363840 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3dd9-account-create-5t4zd" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.603083 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.735706 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzmj\" (UniqueName: \"kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj\") pod \"670c827e-fdbe-47de-8b0f-be94d7668f2c\" (UID: \"670c827e-fdbe-47de-8b0f-be94d7668f2c\") " Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.739777 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj" (OuterVolumeSpecName: "kube-api-access-ktzmj") pod "670c827e-fdbe-47de-8b0f-be94d7668f2c" (UID: "670c827e-fdbe-47de-8b0f-be94d7668f2c"). InnerVolumeSpecName "kube-api-access-ktzmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:16 crc kubenswrapper[4936]: I0930 13:56:16.837308 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzmj\" (UniqueName: \"kubernetes.io/projected/670c827e-fdbe-47de-8b0f-be94d7668f2c-kube-api-access-ktzmj\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:17 crc kubenswrapper[4936]: I0930 13:56:17.373230 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c9ba-account-create-ntmbh" event={"ID":"670c827e-fdbe-47de-8b0f-be94d7668f2c","Type":"ContainerDied","Data":"0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546"} Sep 30 13:56:17 crc kubenswrapper[4936]: I0930 13:56:17.374206 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb94bc55ddc28fdff4790e6f9014ab3f99d6837592ebc202a39719364dbe546" Sep 30 13:56:17 crc kubenswrapper[4936]: I0930 13:56:17.373279 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c9ba-account-create-ntmbh" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.650236 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l85t5"] Sep 30 13:56:18 crc kubenswrapper[4936]: E0930 13:56:18.651773 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670c827e-fdbe-47de-8b0f-be94d7668f2c" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.651793 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="670c827e-fdbe-47de-8b0f-be94d7668f2c" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: E0930 13:56:18.651804 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40017b63-ca9a-49cf-9e10-40ae1ec179e5" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.651810 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40017b63-ca9a-49cf-9e10-40ae1ec179e5" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: E0930 13:56:18.651823 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfcd490-2af5-4e6d-a842-337e4dfd7333" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.651828 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfcd490-2af5-4e6d-a842-337e4dfd7333" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.652003 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="670c827e-fdbe-47de-8b0f-be94d7668f2c" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.652017 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="40017b63-ca9a-49cf-9e10-40ae1ec179e5" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.652027 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfcd490-2af5-4e6d-a842-337e4dfd7333" containerName="mariadb-account-create" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.652573 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.655441 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.656021 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxmps" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.667340 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l85t5"] Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.764667 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pfr6\" (UniqueName: \"kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.764831 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.764898 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.765129 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.866959 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.867077 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.867130 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.867188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pfr6\" (UniqueName: \"kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.876661 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data\") pod \"glance-db-sync-l85t5\" (UID: 
\"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.877706 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.877890 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.884752 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pfr6\" (UniqueName: \"kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6\") pod \"glance-db-sync-l85t5\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:18 crc kubenswrapper[4936]: I0930 13:56:18.969528 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:19 crc kubenswrapper[4936]: I0930 13:56:19.308548 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l85t5"] Sep 30 13:56:19 crc kubenswrapper[4936]: I0930 13:56:19.387425 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l85t5" event={"ID":"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390","Type":"ContainerStarted","Data":"1177032b9324ac9d406e731319de0ff58ab01479f444ac0325dca469f8f79ca5"} Sep 30 13:56:19 crc kubenswrapper[4936]: I0930 13:56:19.621989 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Sep 30 13:56:19 crc kubenswrapper[4936]: I0930 13:56:19.977264 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Sep 30 13:56:29 crc kubenswrapper[4936]: I0930 13:56:29.623066 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 13:56:29 crc kubenswrapper[4936]: I0930 13:56:29.978525 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.013779 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-szn75"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.015324 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-szn75" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.061305 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-szn75"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.149546 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hwdv\" (UniqueName: \"kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv\") pod \"cinder-db-create-szn75\" (UID: \"f6a43128-6ad1-47d4-80ea-afcf5532be4f\") " pod="openstack/cinder-db-create-szn75" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.247697 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j69z9"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.248931 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.250802 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hwdv\" (UniqueName: \"kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv\") pod \"cinder-db-create-szn75\" (UID: \"f6a43128-6ad1-47d4-80ea-afcf5532be4f\") " pod="openstack/cinder-db-create-szn75" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.272576 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j69z9"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.304435 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hwdv\" (UniqueName: \"kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv\") pod \"cinder-db-create-szn75\" (UID: \"f6a43128-6ad1-47d4-80ea-afcf5532be4f\") " pod="openstack/cinder-db-create-szn75" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.335938 4936 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-db-create-szn75" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.353827 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrc7\" (UniqueName: \"kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7\") pod \"barbican-db-create-j69z9\" (UID: \"300c85d9-b8e5-4d33-9fd2-86964369fe57\") " pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.446291 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-j2wgp"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.448180 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.456484 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxrc7\" (UniqueName: \"kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7\") pod \"barbican-db-create-j69z9\" (UID: \"300c85d9-b8e5-4d33-9fd2-86964369fe57\") " pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.462726 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j2wgp"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.504178 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxrc7\" (UniqueName: \"kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7\") pod \"barbican-db-create-j69z9\" (UID: \"300c85d9-b8e5-4d33-9fd2-86964369fe57\") " pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.537545 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rwxb8"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.538552 4936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.545521 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.545704 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jszhj" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.545877 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.547603 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.558014 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbsk\" (UniqueName: \"kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk\") pod \"neutron-db-create-j2wgp\" (UID: \"d49846e9-5f5a-4e32-8f59-e9a10e2e98af\") " pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.567700 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rwxb8"] Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.582684 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.660149 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.660201 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngbz\" (UniqueName: \"kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.660232 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbsk\" (UniqueName: \"kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk\") pod \"neutron-db-create-j2wgp\" (UID: \"d49846e9-5f5a-4e32-8f59-e9a10e2e98af\") " pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.660369 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.678442 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbsk\" (UniqueName: \"kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk\") pod \"neutron-db-create-j2wgp\" (UID: \"d49846e9-5f5a-4e32-8f59-e9a10e2e98af\") 
" pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.761323 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.761435 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.761471 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wngbz\" (UniqueName: \"kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.765502 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.765786 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.769810 4936 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.782509 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngbz\" (UniqueName: \"kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz\") pod \"keystone-db-sync-rwxb8\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:30 crc kubenswrapper[4936]: I0930 13:56:30.856568 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.309223 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rwxb8"] Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.368987 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j2wgp"] Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.530614 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j2wgp" event={"ID":"d49846e9-5f5a-4e32-8f59-e9a10e2e98af","Type":"ContainerStarted","Data":"d29aa7ac7db0385a730b4a971ef158c66c5fa3534027004e51136835da17b620"} Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.531528 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rwxb8" event={"ID":"37485d56-bd76-442f-b986-00ccd913d8cb","Type":"ContainerStarted","Data":"afede658b10b2421176d730733f485bb9b253ec14e308cdcd870fed67d54f442"} Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.580777 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j69z9"] Sep 30 13:56:33 crc kubenswrapper[4936]: I0930 13:56:33.625572 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-szn75"] Sep 30 13:56:33 crc kubenswrapper[4936]: W0930 13:56:33.650803 4936 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a43128_6ad1_47d4_80ea_afcf5532be4f.slice/crio-9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968 WatchSource:0}: Error finding container 9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968: Status 404 returned error can't find the container with id 9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968 Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.565576 4936 generic.go:334] "Generic (PLEG): container finished" podID="f6a43128-6ad1-47d4-80ea-afcf5532be4f" containerID="a42e8b0c3f257fbc5d911478dda460aa3e7e78b70e5afed7356d2fe09a528238" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.565950 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-szn75" event={"ID":"f6a43128-6ad1-47d4-80ea-afcf5532be4f","Type":"ContainerDied","Data":"a42e8b0c3f257fbc5d911478dda460aa3e7e78b70e5afed7356d2fe09a528238"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.566169 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-szn75" event={"ID":"f6a43128-6ad1-47d4-80ea-afcf5532be4f","Type":"ContainerStarted","Data":"9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.570916 4936 generic.go:334] "Generic (PLEG): container finished" podID="300c85d9-b8e5-4d33-9fd2-86964369fe57" containerID="f79c7abb49e439f3adfae0dd557fc8b620d7b2a70b6e0403c3ae39f2a693b317" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.570972 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j69z9" event={"ID":"300c85d9-b8e5-4d33-9fd2-86964369fe57","Type":"ContainerDied","Data":"f79c7abb49e439f3adfae0dd557fc8b620d7b2a70b6e0403c3ae39f2a693b317"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.570996 4936 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-create-j69z9" event={"ID":"300c85d9-b8e5-4d33-9fd2-86964369fe57","Type":"ContainerStarted","Data":"931d2d2e1b7cc201453d66ceca5ec7491424e3ae165ae6c5dd76332084f73a5f"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.572759 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l85t5" event={"ID":"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390","Type":"ContainerStarted","Data":"aa5c4932ba8990f0d13a11a5f8116494282ce66beadf47c3447a670008750592"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.577436 4936 generic.go:334] "Generic (PLEG): container finished" podID="d49846e9-5f5a-4e32-8f59-e9a10e2e98af" containerID="b959c0d179df171e5308d2ca62a7a235d1452bd743dd686f0b9ebb02196478de" exitCode=0 Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.577492 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j2wgp" event={"ID":"d49846e9-5f5a-4e32-8f59-e9a10e2e98af","Type":"ContainerDied","Data":"b959c0d179df171e5308d2ca62a7a235d1452bd743dd686f0b9ebb02196478de"} Sep 30 13:56:34 crc kubenswrapper[4936]: I0930 13:56:34.663065 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-l85t5" podStartSLOduration=3.06983418 podStartE2EDuration="16.663046409s" podCreationTimestamp="2025-09-30 13:56:18 +0000 UTC" firstStartedPulling="2025-09-30 13:56:19.322543399 +0000 UTC m=+1029.706545690" lastFinishedPulling="2025-09-30 13:56:32.915755618 +0000 UTC m=+1043.299757919" observedRunningTime="2025-09-30 13:56:34.642888028 +0000 UTC m=+1045.026890329" watchObservedRunningTime="2025-09-30 13:56:34.663046409 +0000 UTC m=+1045.047048710" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.076370 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-szn75" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.086779 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.131469 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.267325 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hwdv\" (UniqueName: \"kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv\") pod \"f6a43128-6ad1-47d4-80ea-afcf5532be4f\" (UID: \"f6a43128-6ad1-47d4-80ea-afcf5532be4f\") " Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.268381 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxrc7\" (UniqueName: \"kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7\") pod \"300c85d9-b8e5-4d33-9fd2-86964369fe57\" (UID: \"300c85d9-b8e5-4d33-9fd2-86964369fe57\") " Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.268466 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpbsk\" (UniqueName: \"kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk\") pod \"d49846e9-5f5a-4e32-8f59-e9a10e2e98af\" (UID: \"d49846e9-5f5a-4e32-8f59-e9a10e2e98af\") " Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.275612 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv" (OuterVolumeSpecName: "kube-api-access-9hwdv") pod "f6a43128-6ad1-47d4-80ea-afcf5532be4f" (UID: "f6a43128-6ad1-47d4-80ea-afcf5532be4f"). InnerVolumeSpecName "kube-api-access-9hwdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.280509 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7" (OuterVolumeSpecName: "kube-api-access-xxrc7") pod "300c85d9-b8e5-4d33-9fd2-86964369fe57" (UID: "300c85d9-b8e5-4d33-9fd2-86964369fe57"). InnerVolumeSpecName "kube-api-access-xxrc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.289635 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk" (OuterVolumeSpecName: "kube-api-access-tpbsk") pod "d49846e9-5f5a-4e32-8f59-e9a10e2e98af" (UID: "d49846e9-5f5a-4e32-8f59-e9a10e2e98af"). InnerVolumeSpecName "kube-api-access-tpbsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.369530 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hwdv\" (UniqueName: \"kubernetes.io/projected/f6a43128-6ad1-47d4-80ea-afcf5532be4f-kube-api-access-9hwdv\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.369567 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxrc7\" (UniqueName: \"kubernetes.io/projected/300c85d9-b8e5-4d33-9fd2-86964369fe57-kube-api-access-xxrc7\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.369580 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpbsk\" (UniqueName: \"kubernetes.io/projected/d49846e9-5f5a-4e32-8f59-e9a10e2e98af-kube-api-access-tpbsk\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.594922 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-szn75" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.594940 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-szn75" event={"ID":"f6a43128-6ad1-47d4-80ea-afcf5532be4f","Type":"ContainerDied","Data":"9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968"} Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.594965 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a69c486c59abb42dbd9643a23b83d3c68689bf0b2f6866394f34712fb618968" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.598530 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j69z9" event={"ID":"300c85d9-b8e5-4d33-9fd2-86964369fe57","Type":"ContainerDied","Data":"931d2d2e1b7cc201453d66ceca5ec7491424e3ae165ae6c5dd76332084f73a5f"} Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.598567 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="931d2d2e1b7cc201453d66ceca5ec7491424e3ae165ae6c5dd76332084f73a5f" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.598634 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j69z9" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.601143 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j2wgp" event={"ID":"d49846e9-5f5a-4e32-8f59-e9a10e2e98af","Type":"ContainerDied","Data":"d29aa7ac7db0385a730b4a971ef158c66c5fa3534027004e51136835da17b620"} Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.601192 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29aa7ac7db0385a730b4a971ef158c66c5fa3534027004e51136835da17b620" Sep 30 13:56:36 crc kubenswrapper[4936]: I0930 13:56:36.601304 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-j2wgp" Sep 30 13:56:39 crc kubenswrapper[4936]: I0930 13:56:39.651121 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rwxb8" event={"ID":"37485d56-bd76-442f-b986-00ccd913d8cb","Type":"ContainerStarted","Data":"3b22d180dae995bc282493d1c02e5d074063fdcac50fad430a0fe9541c21aa10"} Sep 30 13:56:39 crc kubenswrapper[4936]: I0930 13:56:39.675725 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rwxb8" podStartSLOduration=3.683057011 podStartE2EDuration="9.675705196s" podCreationTimestamp="2025-09-30 13:56:30 +0000 UTC" firstStartedPulling="2025-09-30 13:56:33.324806919 +0000 UTC m=+1043.708809220" lastFinishedPulling="2025-09-30 13:56:39.317455114 +0000 UTC m=+1049.701457405" observedRunningTime="2025-09-30 13:56:39.67290798 +0000 UTC m=+1050.056910281" watchObservedRunningTime="2025-09-30 13:56:39.675705196 +0000 UTC m=+1050.059707497" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085127 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-62f6-account-create-275gf"] Sep 30 13:56:40 crc kubenswrapper[4936]: E0930 13:56:40.085475 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a43128-6ad1-47d4-80ea-afcf5532be4f" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085486 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a43128-6ad1-47d4-80ea-afcf5532be4f" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: E0930 13:56:40.085497 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300c85d9-b8e5-4d33-9fd2-86964369fe57" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085504 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="300c85d9-b8e5-4d33-9fd2-86964369fe57" containerName="mariadb-database-create" Sep 30 13:56:40 
crc kubenswrapper[4936]: E0930 13:56:40.085518 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49846e9-5f5a-4e32-8f59-e9a10e2e98af" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085524 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49846e9-5f5a-4e32-8f59-e9a10e2e98af" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085704 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d49846e9-5f5a-4e32-8f59-e9a10e2e98af" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085723 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a43128-6ad1-47d4-80ea-afcf5532be4f" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.085732 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="300c85d9-b8e5-4d33-9fd2-86964369fe57" containerName="mariadb-database-create" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.086224 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.089413 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.108672 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-62f6-account-create-275gf"] Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.235600 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgrr8\" (UniqueName: \"kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8\") pod \"cinder-62f6-account-create-275gf\" (UID: \"8743c649-a846-418d-9946-774e4cfb2553\") " pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.338315 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgrr8\" (UniqueName: \"kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8\") pod \"cinder-62f6-account-create-275gf\" (UID: \"8743c649-a846-418d-9946-774e4cfb2553\") " pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.361199 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgrr8\" (UniqueName: \"kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8\") pod \"cinder-62f6-account-create-275gf\" (UID: \"8743c649-a846-418d-9946-774e4cfb2553\") " pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.402618 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:40 crc kubenswrapper[4936]: I0930 13:56:40.648588 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-62f6-account-create-275gf"] Sep 30 13:56:40 crc kubenswrapper[4936]: W0930 13:56:40.657746 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8743c649_a846_418d_9946_774e4cfb2553.slice/crio-d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881 WatchSource:0}: Error finding container d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881: Status 404 returned error can't find the container with id d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881 Sep 30 13:56:41 crc kubenswrapper[4936]: I0930 13:56:41.668912 4936 generic.go:334] "Generic (PLEG): container finished" podID="8743c649-a846-418d-9946-774e4cfb2553" containerID="7480448e26978a986e25400e86a84488371b674d1a69e6cc990e13d8f9283bee" exitCode=0 Sep 30 13:56:41 crc kubenswrapper[4936]: I0930 13:56:41.668998 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62f6-account-create-275gf" event={"ID":"8743c649-a846-418d-9946-774e4cfb2553","Type":"ContainerDied","Data":"7480448e26978a986e25400e86a84488371b674d1a69e6cc990e13d8f9283bee"} Sep 30 13:56:41 crc kubenswrapper[4936]: I0930 13:56:41.669206 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62f6-account-create-275gf" event={"ID":"8743c649-a846-418d-9946-774e4cfb2553","Type":"ContainerStarted","Data":"d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881"} Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.223028 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.287764 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgrr8\" (UniqueName: \"kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8\") pod \"8743c649-a846-418d-9946-774e4cfb2553\" (UID: \"8743c649-a846-418d-9946-774e4cfb2553\") " Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.302477 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8" (OuterVolumeSpecName: "kube-api-access-sgrr8") pod "8743c649-a846-418d-9946-774e4cfb2553" (UID: "8743c649-a846-418d-9946-774e4cfb2553"). InnerVolumeSpecName "kube-api-access-sgrr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.389604 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgrr8\" (UniqueName: \"kubernetes.io/projected/8743c649-a846-418d-9946-774e4cfb2553-kube-api-access-sgrr8\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.683864 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62f6-account-create-275gf" event={"ID":"8743c649-a846-418d-9946-774e4cfb2553","Type":"ContainerDied","Data":"d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881"} Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.683916 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88528d1e3a91a424985f3f998d0fa424e5ddbc1af762c26c7e4b20e71622881" Sep 30 13:56:43 crc kubenswrapper[4936]: I0930 13:56:43.684002 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-62f6-account-create-275gf" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.223979 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9507-account-create-fkds9"] Sep 30 13:56:50 crc kubenswrapper[4936]: E0930 13:56:50.224829 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8743c649-a846-418d-9946-774e4cfb2553" containerName="mariadb-account-create" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.224926 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8743c649-a846-418d-9946-774e4cfb2553" containerName="mariadb-account-create" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.225114 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8743c649-a846-418d-9946-774e4cfb2553" containerName="mariadb-account-create" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.225748 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.231096 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.232977 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9507-account-create-fkds9"] Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.415969 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqjsz\" (UniqueName: \"kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz\") pod \"barbican-9507-account-create-fkds9\" (UID: \"660cc428-d532-48bb-8272-d698d7b4b8db\") " pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.426662 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-08c3-account-create-cbk2n"] Sep 30 
13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.427770 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.429616 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.436988 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-08c3-account-create-cbk2n"] Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.517152 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xqp\" (UniqueName: \"kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp\") pod \"neutron-08c3-account-create-cbk2n\" (UID: \"9bb49398-578d-45ba-beee-c10d7ecd9b37\") " pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.517228 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqjsz\" (UniqueName: \"kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz\") pod \"barbican-9507-account-create-fkds9\" (UID: \"660cc428-d532-48bb-8272-d698d7b4b8db\") " pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.537836 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqjsz\" (UniqueName: \"kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz\") pod \"barbican-9507-account-create-fkds9\" (UID: \"660cc428-d532-48bb-8272-d698d7b4b8db\") " pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.543999 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.618411 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xqp\" (UniqueName: \"kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp\") pod \"neutron-08c3-account-create-cbk2n\" (UID: \"9bb49398-578d-45ba-beee-c10d7ecd9b37\") " pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.646811 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xqp\" (UniqueName: \"kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp\") pod \"neutron-08c3-account-create-cbk2n\" (UID: \"9bb49398-578d-45ba-beee-c10d7ecd9b37\") " pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.732597 4936 generic.go:334] "Generic (PLEG): container finished" podID="37485d56-bd76-442f-b986-00ccd913d8cb" containerID="3b22d180dae995bc282493d1c02e5d074063fdcac50fad430a0fe9541c21aa10" exitCode=0 Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.732666 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rwxb8" event={"ID":"37485d56-bd76-442f-b986-00ccd913d8cb","Type":"ContainerDied","Data":"3b22d180dae995bc282493d1c02e5d074063fdcac50fad430a0fe9541c21aa10"} Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.745081 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:50 crc kubenswrapper[4936]: I0930 13:56:50.987864 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9507-account-create-fkds9"] Sep 30 13:56:50 crc kubenswrapper[4936]: W0930 13:56:50.997097 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660cc428_d532_48bb_8272_d698d7b4b8db.slice/crio-27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70 WatchSource:0}: Error finding container 27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70: Status 404 returned error can't find the container with id 27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70 Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.239459 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-08c3-account-create-cbk2n"] Sep 30 13:56:51 crc kubenswrapper[4936]: W0930 13:56:51.245188 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb49398_578d_45ba_beee_c10d7ecd9b37.slice/crio-a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1 WatchSource:0}: Error finding container a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1: Status 404 returned error can't find the container with id a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1 Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.741992 4936 generic.go:334] "Generic (PLEG): container finished" podID="660cc428-d532-48bb-8272-d698d7b4b8db" containerID="59fcefdcde957626ff7aec0338852740946e9f031938675b02b14527b29f749b" exitCode=0 Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.742030 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9507-account-create-fkds9" 
event={"ID":"660cc428-d532-48bb-8272-d698d7b4b8db","Type":"ContainerDied","Data":"59fcefdcde957626ff7aec0338852740946e9f031938675b02b14527b29f749b"} Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.742520 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9507-account-create-fkds9" event={"ID":"660cc428-d532-48bb-8272-d698d7b4b8db","Type":"ContainerStarted","Data":"27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70"} Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.744685 4936 generic.go:334] "Generic (PLEG): container finished" podID="9bb49398-578d-45ba-beee-c10d7ecd9b37" containerID="e76d4a0c938d00f17d421adc31b7b07ef7fd05d6a389edad54b818cd7c52ad88" exitCode=0 Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.744747 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-08c3-account-create-cbk2n" event={"ID":"9bb49398-578d-45ba-beee-c10d7ecd9b37","Type":"ContainerDied","Data":"e76d4a0c938d00f17d421adc31b7b07ef7fd05d6a389edad54b818cd7c52ad88"} Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.744772 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-08c3-account-create-cbk2n" event={"ID":"9bb49398-578d-45ba-beee-c10d7ecd9b37","Type":"ContainerStarted","Data":"a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1"} Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.746726 4936 generic.go:334] "Generic (PLEG): container finished" podID="6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" containerID="aa5c4932ba8990f0d13a11a5f8116494282ce66beadf47c3447a670008750592" exitCode=0 Sep 30 13:56:51 crc kubenswrapper[4936]: I0930 13:56:51.746802 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l85t5" event={"ID":"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390","Type":"ContainerDied","Data":"aa5c4932ba8990f0d13a11a5f8116494282ce66beadf47c3447a670008750592"} Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.104606 4936 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.246261 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data\") pod \"37485d56-bd76-442f-b986-00ccd913d8cb\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.246454 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wngbz\" (UniqueName: \"kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz\") pod \"37485d56-bd76-442f-b986-00ccd913d8cb\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.246627 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle\") pod \"37485d56-bd76-442f-b986-00ccd913d8cb\" (UID: \"37485d56-bd76-442f-b986-00ccd913d8cb\") " Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.261872 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz" (OuterVolumeSpecName: "kube-api-access-wngbz") pod "37485d56-bd76-442f-b986-00ccd913d8cb" (UID: "37485d56-bd76-442f-b986-00ccd913d8cb"). InnerVolumeSpecName "kube-api-access-wngbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.278673 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37485d56-bd76-442f-b986-00ccd913d8cb" (UID: "37485d56-bd76-442f-b986-00ccd913d8cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.293601 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data" (OuterVolumeSpecName: "config-data") pod "37485d56-bd76-442f-b986-00ccd913d8cb" (UID: "37485d56-bd76-442f-b986-00ccd913d8cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.353092 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wngbz\" (UniqueName: \"kubernetes.io/projected/37485d56-bd76-442f-b986-00ccd913d8cb-kube-api-access-wngbz\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.353149 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.353162 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37485d56-bd76-442f-b986-00ccd913d8cb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.755433 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rwxb8" 
event={"ID":"37485d56-bd76-442f-b986-00ccd913d8cb","Type":"ContainerDied","Data":"afede658b10b2421176d730733f485bb9b253ec14e308cdcd870fed67d54f442"} Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.755878 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afede658b10b2421176d730733f485bb9b253ec14e308cdcd870fed67d54f442" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.755596 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rwxb8" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.983767 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:52 crc kubenswrapper[4936]: E0930 13:56:52.984098 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37485d56-bd76-442f-b986-00ccd913d8cb" containerName="keystone-db-sync" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.984114 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="37485d56-bd76-442f-b986-00ccd913d8cb" containerName="keystone-db-sync" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.984271 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="37485d56-bd76-442f-b986-00ccd913d8cb" containerName="keystone-db-sync" Sep 30 13:56:52 crc kubenswrapper[4936]: I0930 13:56:52.985139 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.043592 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.060596 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hc4xm"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.061713 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065052 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065057 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065129 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065193 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065211 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vzp\" (UniqueName: \"kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065261 4936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-config-data" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.065279 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.066692 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jszhj" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.066979 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.105841 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hc4xm"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168244 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168355 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168385 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts\") pod \"keystone-bootstrap-hc4xm\" 
(UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168418 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168446 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168505 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168532 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vzp\" (UniqueName: \"kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168568 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: 
\"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168593 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168634 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.168659 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.169878 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.171007 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " 
pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.171721 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.182910 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.211229 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vzp\" (UniqueName: \"kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp\") pod \"dnsmasq-dns-75bb4695fc-tb4xm\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.269612 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.269670 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 
13:56:53.269719 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.269746 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.269816 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.269834 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.282049 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.285242 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.287159 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.288251 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.294693 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.326479 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7\") pod \"keystone-bootstrap-hc4xm\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.347923 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.405945 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.413798 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-d9lhn"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.441384 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.450573 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d9lhn"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.451035 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t7snr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.451275 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.451697 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.483032 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.524460 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.547206 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.578825 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqjsz\" (UniqueName: \"kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz\") pod \"660cc428-d532-48bb-8272-d698d7b4b8db\" (UID: \"660cc428-d532-48bb-8272-d698d7b4b8db\") " Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579288 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6xqp\" (UniqueName: \"kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp\") pod \"9bb49398-578d-45ba-beee-c10d7ecd9b37\" (UID: \"9bb49398-578d-45ba-beee-c10d7ecd9b37\") " Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579562 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsz2n\" (UniqueName: \"kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579639 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579722 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579761 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.579807 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.605576 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp" (OuterVolumeSpecName: "kube-api-access-m6xqp") pod "9bb49398-578d-45ba-beee-c10d7ecd9b37" (UID: "9bb49398-578d-45ba-beee-c10d7ecd9b37"). InnerVolumeSpecName "kube-api-access-m6xqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.605828 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz" (OuterVolumeSpecName: "kube-api-access-pqjsz") pod "660cc428-d532-48bb-8272-d698d7b4b8db" (UID: "660cc428-d532-48bb-8272-d698d7b4b8db"). InnerVolumeSpecName "kube-api-access-pqjsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.616683 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lgmpr"] Sep 30 13:56:53 crc kubenswrapper[4936]: E0930 13:56:53.617083 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660cc428-d532-48bb-8272-d698d7b4b8db" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.617104 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="660cc428-d532-48bb-8272-d698d7b4b8db" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: E0930 13:56:53.617150 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb49398-578d-45ba-beee-c10d7ecd9b37" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.617158 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb49398-578d-45ba-beee-c10d7ecd9b37" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.617376 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="660cc428-d532-48bb-8272-d698d7b4b8db" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.617393 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb49398-578d-45ba-beee-c10d7ecd9b37" containerName="mariadb-account-create" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.618064 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.620754 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6pd4m" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.621292 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.621615 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsz2n\" (UniqueName: \"kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681692 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681767 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681795 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data\") pod 
\"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681834 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681898 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681921 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681947 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.681977 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " 
pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.682001 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95xt4\" (UniqueName: \"kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.682045 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.682139 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6xqp\" (UniqueName: \"kubernetes.io/projected/9bb49398-578d-45ba-beee-c10d7ecd9b37-kube-api-access-m6xqp\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.682152 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqjsz\" (UniqueName: \"kubernetes.io/projected/660cc428-d532-48bb-8272-d698d7b4b8db-kube-api-access-pqjsz\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.683327 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.687177 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data\") pod \"placement-db-sync-d9lhn\" 
(UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.696937 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.700847 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.702105 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.704397 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsz2n\" (UniqueName: \"kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n\") pod \"placement-db-sync-d9lhn\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") " pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.706939 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.712752 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgmpr"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.774629 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.775948 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9507-account-create-fkds9" event={"ID":"660cc428-d532-48bb-8272-d698d7b4b8db","Type":"ContainerDied","Data":"27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70"} Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.775973 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27cbbbec5285091d1856a259fa77ae706d16d334c67ce492d598025ac7ec2c70" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.776019 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9507-account-create-fkds9" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796110 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkbpk\" (UniqueName: \"kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796157 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796193 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc 
kubenswrapper[4936]: I0930 13:56:53.796213 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796248 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796269 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796289 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95xt4\" (UniqueName: \"kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796541 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796567 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796602 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.796622 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.800944 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.802003 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.806930 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-08c3-account-create-cbk2n" event={"ID":"9bb49398-578d-45ba-beee-c10d7ecd9b37","Type":"ContainerDied","Data":"a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1"} Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.807102 4936 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-08c3-account-create-cbk2n" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.807251 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a302d2bc3182ad8db0bf960dbd2c046147abf89157c7a46b09a1c374c7be30f1" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.814773 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l85t5" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.814768 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l85t5" event={"ID":"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390","Type":"ContainerDied","Data":"1177032b9324ac9d406e731319de0ff58ab01479f444ac0325dca469f8f79ca5"} Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.815167 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1177032b9324ac9d406e731319de0ff58ab01479f444ac0325dca469f8f79ca5" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.839375 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-d9lhn" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.842887 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95xt4\" (UniqueName: \"kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.846735 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.869271 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.870598 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.872262 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data\") pod \"cinder-db-sync-lgmpr\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.878624 4936 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:53 crc kubenswrapper[4936]: E0930 13:56:53.879547 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" containerName="glance-db-sync" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.879586 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" containerName="glance-db-sync" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.879794 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" containerName="glance-db-sync" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.885938 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.893168 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.893453 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.899688 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.900784 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data\") pod \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.900909 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data\") pod \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " Sep 
30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.901129 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle\") pod \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.901229 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pfr6\" (UniqueName: \"kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6\") pod \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\" (UID: \"6b5a2293-5cc6-4ee1-92dd-d63c04c1d390\") " Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.901702 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.901801 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.901968 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.902159 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hkbpk\" (UniqueName: \"kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.902355 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.906552 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.907031 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.921241 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.928747 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.931178 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6" (OuterVolumeSpecName: "kube-api-access-8pfr6") pod "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" (UID: "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390"). InnerVolumeSpecName "kube-api-access-8pfr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.948590 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" (UID: "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:53 crc kubenswrapper[4936]: I0930 13:56:53.980799 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkbpk\" (UniqueName: \"kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk\") pod \"dnsmasq-dns-745b9ddc8c-llvtt\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005020 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005072 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005154 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zhk\" (UniqueName: \"kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005192 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005223 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005241 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005259 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005363 4936 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.005381 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pfr6\" (UniqueName: \"kubernetes.io/projected/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-kube-api-access-8pfr6\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.018759 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" (UID: "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.046845 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data" (OuterVolumeSpecName: "config-data") pod "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" (UID: "6b5a2293-5cc6-4ee1-92dd-d63c04c1d390"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.076783 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106490 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106543 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106567 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106639 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106670 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106758 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zhk\" (UniqueName: \"kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106807 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106885 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.106899 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.108302 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.108413 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.112192 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.113796 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.116649 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.117831 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.130532 4936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.154476 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zhk\" (UniqueName: \"kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk\") pod \"ceilometer-0\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.235052 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.399683 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hc4xm"] Sep 30 13:56:54 crc kubenswrapper[4936]: W0930 13:56:54.563947 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f7c065_08bf_4f22_b9dc_7b060be11052.slice/crio-ac07fad179ff86f0cb77487531f377a274042c10dfd0d01fc21df0103fb27e6c WatchSource:0}: Error finding container ac07fad179ff86f0cb77487531f377a274042c10dfd0d01fc21df0103fb27e6c: Status 404 returned error can't find the container with id ac07fad179ff86f0cb77487531f377a274042c10dfd0d01fc21df0103fb27e6c Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.609664 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.827025 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d9lhn"] Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 13:56:54.835160 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" event={"ID":"c5f7c065-08bf-4f22-b9dc-7b060be11052","Type":"ContainerStarted","Data":"ac07fad179ff86f0cb77487531f377a274042c10dfd0d01fc21df0103fb27e6c"} Sep 30 13:56:54 crc kubenswrapper[4936]: I0930 
13:56:54.838685 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hc4xm" event={"ID":"0547aae6-aeaa-46df-8469-4772b411eb18","Type":"ContainerStarted","Data":"2cc97687863122c8d51c75a8d789bf80c038925d97690be7f763ff60f4367570"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.071271 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgmpr"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.087797 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:55 crc kubenswrapper[4936]: W0930 13:56:55.126638 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod408fde0b_6fce_4887_8fe2_e9cb535c20e2.slice/crio-b3a1699aed8ba7d1ad2cf5114577189002a182a64ef787e06c3a2cc24241feb8 WatchSource:0}: Error finding container b3a1699aed8ba7d1ad2cf5114577189002a182a64ef787e06c3a2cc24241feb8: Status 404 returned error can't find the container with id b3a1699aed8ba7d1ad2cf5114577189002a182a64ef787e06c3a2cc24241feb8 Sep 30 13:56:55 crc kubenswrapper[4936]: E0930 13:56:55.336348 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f7c065_08bf_4f22_b9dc_7b060be11052.slice/crio-47114a7251ee4592bcc5ec676f14d1f1fb3244b4b53c3cfc5ca294605f02b111.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.407584 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.466568 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.513432 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.529049 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.536569 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.573913 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.574175 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6n76\" (UniqueName: \"kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.574267 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.574376 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " 
pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.574491 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.625408 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q4sjx"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.638820 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q4sjx"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.638938 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.642462 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-x72m7" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.645464 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.679774 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.679838 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: 
\"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.679876 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.679914 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llsl\" (UniqueName: \"kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.679975 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.680020 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6n76\" (UniqueName: \"kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.680055 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle\") pod 
\"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.680087 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.681016 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.681796 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.682503 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.682865 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 
13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.719474 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6n76\" (UniqueName: \"kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76\") pod \"dnsmasq-dns-7987f74bbc-9c8pv\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.784199 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.784304 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.784416 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llsl\" (UniqueName: \"kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.796881 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.804323 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.843803 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zrdqm"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.863796 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llsl\" (UniqueName: \"kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl\") pod \"barbican-db-sync-q4sjx\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.886812 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.905880 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgmpr" event={"ID":"c1a55d69-0992-4eb2-a974-f4eedb0bf989","Type":"ContainerStarted","Data":"de784979dddc7abc39e8e4853863e1aafc0ac0fa4c876bede69de5e89b968ebe"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.908442 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zrdqm"] Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.909053 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.913427 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.913585 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ft2gs" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.913731 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.930288 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d9lhn" event={"ID":"870f7b12-4944-4889-92bc-17f413d6ab36","Type":"ContainerStarted","Data":"9123970a303c814c5635406df8ee607e2b2504f12e788a9448e602bf3db286dd"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.969183 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.973680 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hc4xm" event={"ID":"0547aae6-aeaa-46df-8469-4772b411eb18","Type":"ContainerStarted","Data":"71b9706e96c820d8304f3e7e22e31b2d176d7f00c9388602fd5a4adf8315eedc"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.984279 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerStarted","Data":"6a7cf218f977e38442a687e972c4ae617923d7961aa237baf9e02cf50b189267"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.992404 4936 generic.go:334] "Generic (PLEG): container finished" podID="c5f7c065-08bf-4f22-b9dc-7b060be11052" containerID="47114a7251ee4592bcc5ec676f14d1f1fb3244b4b53c3cfc5ca294605f02b111" exitCode=0 Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.994038 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" event={"ID":"c5f7c065-08bf-4f22-b9dc-7b060be11052","Type":"ContainerDied","Data":"47114a7251ee4592bcc5ec676f14d1f1fb3244b4b53c3cfc5ca294605f02b111"} Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.999244 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5snt\" (UniqueName: \"kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.999302 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:55 crc kubenswrapper[4936]: I0930 13:56:55.999436 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.010302 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hc4xm" podStartSLOduration=3.010283547 podStartE2EDuration="3.010283547s" podCreationTimestamp="2025-09-30 13:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:55.987910156 +0000 UTC m=+1066.371912477" watchObservedRunningTime="2025-09-30 13:56:56.010283547 +0000 UTC m=+1066.394285848" Sep 30 
13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.022983 4936 generic.go:334] "Generic (PLEG): container finished" podID="408fde0b-6fce-4887-8fe2-e9cb535c20e2" containerID="161656f62bb629cf240763c50e2e8c2f198bd8ae6326408cd63d0ada1250b3b7" exitCode=0 Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.023046 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" event={"ID":"408fde0b-6fce-4887-8fe2-e9cb535c20e2","Type":"ContainerDied","Data":"161656f62bb629cf240763c50e2e8c2f198bd8ae6326408cd63d0ada1250b3b7"} Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.023080 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" event={"ID":"408fde0b-6fce-4887-8fe2-e9cb535c20e2","Type":"ContainerStarted","Data":"b3a1699aed8ba7d1ad2cf5114577189002a182a64ef787e06c3a2cc24241feb8"} Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.106267 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5snt\" (UniqueName: \"kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.106325 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.106560 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " 
pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.117719 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.119057 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.126253 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5snt\" (UniqueName: \"kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt\") pod \"neutron-db-sync-zrdqm\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.280457 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.550777 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.702534 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.738691 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config\") pod \"c5f7c065-08bf-4f22-b9dc-7b060be11052\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.738809 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc\") pod \"c5f7c065-08bf-4f22-b9dc-7b060be11052\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.747525 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb\") pod \"c5f7c065-08bf-4f22-b9dc-7b060be11052\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.747632 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb\") pod \"c5f7c065-08bf-4f22-b9dc-7b060be11052\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.747654 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vzp\" (UniqueName: \"kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp\") pod \"c5f7c065-08bf-4f22-b9dc-7b060be11052\" (UID: \"c5f7c065-08bf-4f22-b9dc-7b060be11052\") " Sep 30 13:56:56 crc 
kubenswrapper[4936]: I0930 13:56:56.765961 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.774409 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp" (OuterVolumeSpecName: "kube-api-access-p2vzp") pod "c5f7c065-08bf-4f22-b9dc-7b060be11052" (UID: "c5f7c065-08bf-4f22-b9dc-7b060be11052"). InnerVolumeSpecName "kube-api-access-p2vzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.788469 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config" (OuterVolumeSpecName: "config") pod "c5f7c065-08bf-4f22-b9dc-7b060be11052" (UID: "c5f7c065-08bf-4f22-b9dc-7b060be11052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.804616 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5f7c065-08bf-4f22-b9dc-7b060be11052" (UID: "c5f7c065-08bf-4f22-b9dc-7b060be11052"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.812472 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5f7c065-08bf-4f22-b9dc-7b060be11052" (UID: "c5f7c065-08bf-4f22-b9dc-7b060be11052"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.817265 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5f7c065-08bf-4f22-b9dc-7b060be11052" (UID: "c5f7c065-08bf-4f22-b9dc-7b060be11052"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.850446 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.851503 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.851518 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.851528 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5f7c065-08bf-4f22-b9dc-7b060be11052-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.851537 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vzp\" (UniqueName: \"kubernetes.io/projected/c5f7c065-08bf-4f22-b9dc-7b060be11052-kube-api-access-p2vzp\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.943232 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.946204 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q4sjx"] Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.952703 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb\") pod \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.952744 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc\") pod \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.952795 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkbpk\" (UniqueName: \"kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk\") pod \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.952843 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb\") pod \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.952893 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config\") pod \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\" (UID: \"408fde0b-6fce-4887-8fe2-e9cb535c20e2\") " Sep 30 13:56:56 
crc kubenswrapper[4936]: I0930 13:56:56.981546 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "408fde0b-6fce-4887-8fe2-e9cb535c20e2" (UID: "408fde0b-6fce-4887-8fe2-e9cb535c20e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:56 crc kubenswrapper[4936]: I0930 13:56:56.990166 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk" (OuterVolumeSpecName: "kube-api-access-hkbpk") pod "408fde0b-6fce-4887-8fe2-e9cb535c20e2" (UID: "408fde0b-6fce-4887-8fe2-e9cb535c20e2"). InnerVolumeSpecName "kube-api-access-hkbpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.011982 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "408fde0b-6fce-4887-8fe2-e9cb535c20e2" (UID: "408fde0b-6fce-4887-8fe2-e9cb535c20e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.043064 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "408fde0b-6fce-4887-8fe2-e9cb535c20e2" (UID: "408fde0b-6fce-4887-8fe2-e9cb535c20e2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.054486 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkbpk\" (UniqueName: \"kubernetes.io/projected/408fde0b-6fce-4887-8fe2-e9cb535c20e2-kube-api-access-hkbpk\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.054689 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.054798 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.054861 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.057446 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config" (OuterVolumeSpecName: "config") pod "408fde0b-6fce-4887-8fe2-e9cb535c20e2" (UID: "408fde0b-6fce-4887-8fe2-e9cb535c20e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.066866 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.068242 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-llvtt" event={"ID":"408fde0b-6fce-4887-8fe2-e9cb535c20e2","Type":"ContainerDied","Data":"b3a1699aed8ba7d1ad2cf5114577189002a182a64ef787e06c3a2cc24241feb8"} Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.068301 4936 scope.go:117] "RemoveContainer" containerID="161656f62bb629cf240763c50e2e8c2f198bd8ae6326408cd63d0ada1250b3b7" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.084597 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" event={"ID":"3f3b6274-31d7-4826-b554-0e1eadc5a811","Type":"ContainerStarted","Data":"5db9680b4104135b4b0d34a22bd1a8b52846110977b2257e903585b1391cdf1f"} Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.106655 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4sjx" event={"ID":"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a","Type":"ContainerStarted","Data":"ccef8cfe9330b048e63210cc44c60eb8ef6d7c9b6c8301b077bfbc35aa4cfed5"} Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.113168 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" event={"ID":"c5f7c065-08bf-4f22-b9dc-7b060be11052","Type":"ContainerDied","Data":"ac07fad179ff86f0cb77487531f377a274042c10dfd0d01fc21df0103fb27e6c"} Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.113240 4936 scope.go:117] "RemoveContainer" containerID="47114a7251ee4592bcc5ec676f14d1f1fb3244b4b53c3cfc5ca294605f02b111" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.113381 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-tb4xm" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.156237 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408fde0b-6fce-4887-8fe2-e9cb535c20e2-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.182122 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.195192 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-llvtt"] Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.251912 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.261300 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-tb4xm"] Sep 30 13:56:57 crc kubenswrapper[4936]: I0930 13:56:57.396416 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zrdqm"] Sep 30 13:56:57 crc kubenswrapper[4936]: W0930 13:56:57.427590 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e0e7bf_c7d6_4817_a3bc_77189570dfe6.slice/crio-2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea WatchSource:0}: Error finding container 2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea: Status 404 returned error can't find the container with id 2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.146073 4936 generic.go:334] "Generic (PLEG): container finished" podID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerID="7c7eaf225b1c2939e3bf6b2201713b3ad9d07721bf6c46cdf1b7c9c3b61d33d3" exitCode=0 Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.146182 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" event={"ID":"3f3b6274-31d7-4826-b554-0e1eadc5a811","Type":"ContainerDied","Data":"7c7eaf225b1c2939e3bf6b2201713b3ad9d07721bf6c46cdf1b7c9c3b61d33d3"} Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.166542 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zrdqm" event={"ID":"88e0e7bf-c7d6-4817-a3bc-77189570dfe6","Type":"ContainerStarted","Data":"484841fc4e2d47a267c56d0dc269d056bd589e6bf76c5d44edb85e78197a06c4"} Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.166852 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zrdqm" event={"ID":"88e0e7bf-c7d6-4817-a3bc-77189570dfe6","Type":"ContainerStarted","Data":"2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea"} Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.200626 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zrdqm" podStartSLOduration=3.200605518 podStartE2EDuration="3.200605518s" podCreationTimestamp="2025-09-30 13:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:58.19010114 +0000 UTC m=+1068.574103441" watchObservedRunningTime="2025-09-30 13:56:58.200605518 +0000 UTC m=+1068.584607819" Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.329138 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408fde0b-6fce-4887-8fe2-e9cb535c20e2" path="/var/lib/kubelet/pods/408fde0b-6fce-4887-8fe2-e9cb535c20e2/volumes" Sep 30 13:56:58 crc kubenswrapper[4936]: I0930 13:56:58.359575 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f7c065-08bf-4f22-b9dc-7b060be11052" path="/var/lib/kubelet/pods/c5f7c065-08bf-4f22-b9dc-7b060be11052/volumes" Sep 30 13:56:59 crc kubenswrapper[4936]: I0930 13:56:59.176553 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" event={"ID":"3f3b6274-31d7-4826-b554-0e1eadc5a811","Type":"ContainerStarted","Data":"abf43715c04e37acc7230629354fa056b87f84678f8bd91800de34a00a2a92d7"} Sep 30 13:56:59 crc kubenswrapper[4936]: I0930 13:56:59.176708 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:56:59 crc kubenswrapper[4936]: I0930 13:56:59.203493 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" podStartSLOduration=4.203476221 podStartE2EDuration="4.203476221s" podCreationTimestamp="2025-09-30 13:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:59.19650976 +0000 UTC m=+1069.580512071" watchObservedRunningTime="2025-09-30 13:56:59.203476221 +0000 UTC m=+1069.587478522" Sep 30 13:57:03 crc kubenswrapper[4936]: I0930 13:57:03.227370 4936 generic.go:334] "Generic (PLEG): container finished" podID="0547aae6-aeaa-46df-8469-4772b411eb18" containerID="71b9706e96c820d8304f3e7e22e31b2d176d7f00c9388602fd5a4adf8315eedc" exitCode=0 Sep 30 13:57:03 crc kubenswrapper[4936]: I0930 13:57:03.227444 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hc4xm" event={"ID":"0547aae6-aeaa-46df-8469-4772b411eb18","Type":"ContainerDied","Data":"71b9706e96c820d8304f3e7e22e31b2d176d7f00c9388602fd5a4adf8315eedc"} Sep 30 13:57:05 crc kubenswrapper[4936]: I0930 13:57:05.912174 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:57:05 crc kubenswrapper[4936]: I0930 13:57:05.969055 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"] Sep 30 13:57:05 crc kubenswrapper[4936]: I0930 13:57:05.970835 4936 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" containerID="cri-o://4a818229adfc1278fe17e30ef112baa6b8d686bf1210579c5718ad984ff2a863" gracePeriod=10 Sep 30 13:57:06 crc kubenswrapper[4936]: I0930 13:57:06.258131 4936 generic.go:334] "Generic (PLEG): container finished" podID="183c122f-9990-4f38-b78f-6b70607064d6" containerID="4a818229adfc1278fe17e30ef112baa6b8d686bf1210579c5718ad984ff2a863" exitCode=0 Sep 30 13:57:06 crc kubenswrapper[4936]: I0930 13:57:06.258183 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" event={"ID":"183c122f-9990-4f38-b78f-6b70607064d6","Type":"ContainerDied","Data":"4a818229adfc1278fe17e30ef112baa6b8d686bf1210579c5718ad984ff2a863"} Sep 30 13:57:08 crc kubenswrapper[4936]: I0930 13:57:08.003204 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Sep 30 13:57:18 crc kubenswrapper[4936]: I0930 13:57:18.003054 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Sep 30 13:57:18 crc kubenswrapper[4936]: I0930 13:57:18.969985 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.077207 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.077284 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.077443 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.077484 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.077510 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.078082 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts\") pod \"0547aae6-aeaa-46df-8469-4772b411eb18\" (UID: \"0547aae6-aeaa-46df-8469-4772b411eb18\") " Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.082946 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.095476 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts" (OuterVolumeSpecName: "scripts") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.106887 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7" (OuterVolumeSpecName: "kube-api-access-qvlc7") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "kube-api-access-qvlc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.111595 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.111810 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.145845 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data" (OuterVolumeSpecName: "config-data") pod "0547aae6-aeaa-46df-8469-4772b411eb18" (UID: "0547aae6-aeaa-46df-8469-4772b411eb18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180419 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180454 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/0547aae6-aeaa-46df-8469-4772b411eb18-kube-api-access-qvlc7\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180466 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180473 4936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 
13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180482 4936 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.180489 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0547aae6-aeaa-46df-8469-4772b411eb18-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.361049 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hc4xm" event={"ID":"0547aae6-aeaa-46df-8469-4772b411eb18","Type":"ContainerDied","Data":"2cc97687863122c8d51c75a8d789bf80c038925d97690be7f763ff60f4367570"} Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.361090 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cc97687863122c8d51c75a8d789bf80c038925d97690be7f763ff60f4367570" Sep 30 13:57:19 crc kubenswrapper[4936]: I0930 13:57:19.361207 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hc4xm" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.050112 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hc4xm"] Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.062745 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hc4xm"] Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.066442 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.066654 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath
:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsz2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-d9lhn_openstack(870f7b12-4944-4889-92bc-17f413d6ab36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.067824 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-d9lhn" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149242 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gwg87"] Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.149725 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0547aae6-aeaa-46df-8469-4772b411eb18" containerName="keystone-bootstrap" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149740 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0547aae6-aeaa-46df-8469-4772b411eb18" containerName="keystone-bootstrap" Sep 30 
13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.149758 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408fde0b-6fce-4887-8fe2-e9cb535c20e2" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149765 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="408fde0b-6fce-4887-8fe2-e9cb535c20e2" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.149780 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f7c065-08bf-4f22-b9dc-7b060be11052" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149787 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f7c065-08bf-4f22-b9dc-7b060be11052" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149960 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="0547aae6-aeaa-46df-8469-4772b411eb18" containerName="keystone-bootstrap" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149973 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="408fde0b-6fce-4887-8fe2-e9cb535c20e2" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.149997 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f7c065-08bf-4f22-b9dc-7b060be11052" containerName="init" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.150783 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.153855 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jszhj" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.154387 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.154671 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.154935 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.164099 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gwg87"] Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334221 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334637 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334688 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xrt\" (UniqueName: \"kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt\") pod \"keystone-bootstrap-gwg87\" (UID: 
\"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334723 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334739 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.334794 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.360742 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0547aae6-aeaa-46df-8469-4772b411eb18" path="/var/lib/kubelet/pods/0547aae6-aeaa-46df-8469-4772b411eb18/volumes" Sep 30 13:57:20 crc kubenswrapper[4936]: E0930 13:57:20.374147 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-d9lhn" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.435852 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.435942 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.435970 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.436000 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xrt\" (UniqueName: \"kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.436037 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.436058 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.441827 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.441956 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.442021 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.443072 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.449415 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " 
pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.454105 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xrt\" (UniqueName: \"kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt\") pod \"keystone-bootstrap-gwg87\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") " pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:20 crc kubenswrapper[4936]: I0930 13:57:20.469500 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gwg87" Sep 30 13:57:23 crc kubenswrapper[4936]: I0930 13:57:23.004124 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Sep 30 13:57:23 crc kubenswrapper[4936]: I0930 13:57:23.004725 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:57:28 crc kubenswrapper[4936]: I0930 13:57:28.005311 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.209581 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.384001 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb\") pod \"183c122f-9990-4f38-b78f-6b70607064d6\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.384081 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb\") pod \"183c122f-9990-4f38-b78f-6b70607064d6\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.384121 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc\") pod \"183c122f-9990-4f38-b78f-6b70607064d6\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.384826 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qztl7\" (UniqueName: \"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7\") pod \"183c122f-9990-4f38-b78f-6b70607064d6\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.384899 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config\") pod \"183c122f-9990-4f38-b78f-6b70607064d6\" (UID: \"183c122f-9990-4f38-b78f-6b70607064d6\") " Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.389779 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7" (OuterVolumeSpecName: "kube-api-access-qztl7") pod "183c122f-9990-4f38-b78f-6b70607064d6" (UID: "183c122f-9990-4f38-b78f-6b70607064d6"). InnerVolumeSpecName "kube-api-access-qztl7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.434576 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "183c122f-9990-4f38-b78f-6b70607064d6" (UID: "183c122f-9990-4f38-b78f-6b70607064d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.437171 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "183c122f-9990-4f38-b78f-6b70607064d6" (UID: "183c122f-9990-4f38-b78f-6b70607064d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.444779 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "183c122f-9990-4f38-b78f-6b70607064d6" (UID: "183c122f-9990-4f38-b78f-6b70607064d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.449307 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" event={"ID":"183c122f-9990-4f38-b78f-6b70607064d6","Type":"ContainerDied","Data":"cb0d7ac0f9515c593bf754d094800edf7ae1727321d17a125c1dccbe473b629c"}
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.453390 4936 scope.go:117] "RemoveContainer" containerID="4a818229adfc1278fe17e30ef112baa6b8d686bf1210579c5718ad984ff2a863"
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.449448 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c"
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.456774 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config" (OuterVolumeSpecName: "config") pod "183c122f-9990-4f38-b78f-6b70607064d6" (UID: "183c122f-9990-4f38-b78f-6b70607064d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.486433 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.486471 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.486486 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.486495 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qztl7\" (UniqueName: \"kubernetes.io/projected/183c122f-9990-4f38-b78f-6b70607064d6-kube-api-access-qztl7\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.486504 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183c122f-9990-4f38-b78f-6b70607064d6-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.782428 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"]
Sep 30 13:57:29 crc kubenswrapper[4936]: I0930 13:57:29.789356 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qxw5c"]
Sep 30 13:57:30 crc kubenswrapper[4936]: I0930 13:57:30.326229 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183c122f-9990-4f38-b78f-6b70607064d6" path="/var/lib/kubelet/pods/183c122f-9990-4f38-b78f-6b70607064d6/volumes"
Sep 30 13:57:33 crc kubenswrapper[4936]: I0930 13:57:33.006595 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-qxw5c" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout"
Sep 30 13:57:33 crc kubenswrapper[4936]: E0930 13:57:33.502628 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Sep 30 13:57:33 crc kubenswrapper[4936]: E0930 13:57:33.502809 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8llsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-q4sjx_openstack(f4b0ae70-9c0c-48af-8ad9-a226c9798c4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:57:33 crc kubenswrapper[4936]: E0930 13:57:33.503981 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-q4sjx" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a"
Sep 30 13:57:34 crc kubenswrapper[4936]: E0930 13:57:34.489396 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-q4sjx" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a"
Sep 30 13:57:34 crc kubenswrapper[4936]: I0930 13:57:34.776271 4936 scope.go:117] "RemoveContainer" containerID="d0e9782fb59dbc1181eed5b75aa9f014202c363ac0131b190763062e3c7a6176"
Sep 30 13:57:34 crc kubenswrapper[4936]: E0930 13:57:34.796463 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Sep 30 13:57:34 crc kubenswrapper[4936]: E0930 13:57:34.796683 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95xt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lgmpr_openstack(c1a55d69-0992-4eb2-a974-f4eedb0bf989): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 13:57:34 crc kubenswrapper[4936]: E0930 13:57:34.797883 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lgmpr" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989"
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.271210 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gwg87"]
Sep 30 13:57:35 crc kubenswrapper[4936]: W0930 13:57:35.276604 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc404db_3484_40e8_8241_35b197b3f120.slice/crio-69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6 WatchSource:0}: Error finding container 69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6: Status 404 returned error can't find the container with id 69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.498786 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gwg87" event={"ID":"9dc404db-3484-40e8-8241-35b197b3f120","Type":"ContainerStarted","Data":"d4a338b18b03c1738b76b19946a9a2c759f4784c0e1cc7ba79df3b0dd27793a7"}
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.498852 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gwg87" event={"ID":"9dc404db-3484-40e8-8241-35b197b3f120","Type":"ContainerStarted","Data":"69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6"}
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.503387 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d9lhn" event={"ID":"870f7b12-4944-4889-92bc-17f413d6ab36","Type":"ContainerStarted","Data":"61ed5268bcfb5e8028980a859627f7498b97207f38c169f71d935dd3efbde843"}
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.507036 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerStarted","Data":"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a"}
Sep 30 13:57:35 crc kubenswrapper[4936]: E0930 13:57:35.510356 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lgmpr" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989"
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.519866 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gwg87" podStartSLOduration=15.519849451 podStartE2EDuration="15.519849451s" podCreationTimestamp="2025-09-30 13:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:35.517306988 +0000 UTC m=+1105.901309289" watchObservedRunningTime="2025-09-30 13:57:35.519849451 +0000 UTC m=+1105.903851752"
Sep 30 13:57:35 crc kubenswrapper[4936]: I0930 13:57:35.563613 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-d9lhn" podStartSLOduration=2.5302612030000002 podStartE2EDuration="42.563595496s" podCreationTimestamp="2025-09-30 13:56:53 +0000 UTC" firstStartedPulling="2025-09-30 13:56:54.866519313 +0000 UTC m=+1065.250521614" lastFinishedPulling="2025-09-30 13:57:34.899853616 +0000 UTC m=+1105.283855907" observedRunningTime="2025-09-30 13:57:35.560616141 +0000 UTC m=+1105.944618442" watchObservedRunningTime="2025-09-30 13:57:35.563595496 +0000 UTC m=+1105.947597797"
Sep 30 13:57:37 crc kubenswrapper[4936]: I0930 13:57:37.530654 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerStarted","Data":"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada"}
Sep 30 13:57:38 crc kubenswrapper[4936]: I0930 13:57:38.549534 4936 generic.go:334] "Generic (PLEG): container finished" podID="870f7b12-4944-4889-92bc-17f413d6ab36" containerID="61ed5268bcfb5e8028980a859627f7498b97207f38c169f71d935dd3efbde843" exitCode=0
Sep 30 13:57:38 crc kubenswrapper[4936]: I0930 13:57:38.549640 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d9lhn" event={"ID":"870f7b12-4944-4889-92bc-17f413d6ab36","Type":"ContainerDied","Data":"61ed5268bcfb5e8028980a859627f7498b97207f38c169f71d935dd3efbde843"}
Sep 30 13:57:40 crc kubenswrapper[4936]: I0930 13:57:40.567207 4936 generic.go:334] "Generic (PLEG): container finished" podID="9dc404db-3484-40e8-8241-35b197b3f120" containerID="d4a338b18b03c1738b76b19946a9a2c759f4784c0e1cc7ba79df3b0dd27793a7" exitCode=0
Sep 30 13:57:40 crc kubenswrapper[4936]: I0930 13:57:40.567494 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gwg87" event={"ID":"9dc404db-3484-40e8-8241-35b197b3f120","Type":"ContainerDied","Data":"d4a338b18b03c1738b76b19946a9a2c759f4784c0e1cc7ba79df3b0dd27793a7"}
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.376154 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d9lhn"
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.545561 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data\") pod \"870f7b12-4944-4889-92bc-17f413d6ab36\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.545634 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs\") pod \"870f7b12-4944-4889-92bc-17f413d6ab36\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.545707 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsz2n\" (UniqueName: \"kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n\") pod \"870f7b12-4944-4889-92bc-17f413d6ab36\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.545750 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle\") pod \"870f7b12-4944-4889-92bc-17f413d6ab36\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.545783 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts\") pod \"870f7b12-4944-4889-92bc-17f413d6ab36\" (UID: \"870f7b12-4944-4889-92bc-17f413d6ab36\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.546971 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs" (OuterVolumeSpecName: "logs") pod "870f7b12-4944-4889-92bc-17f413d6ab36" (UID: "870f7b12-4944-4889-92bc-17f413d6ab36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.551739 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts" (OuterVolumeSpecName: "scripts") pod "870f7b12-4944-4889-92bc-17f413d6ab36" (UID: "870f7b12-4944-4889-92bc-17f413d6ab36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.551834 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n" (OuterVolumeSpecName: "kube-api-access-rsz2n") pod "870f7b12-4944-4889-92bc-17f413d6ab36" (UID: "870f7b12-4944-4889-92bc-17f413d6ab36"). InnerVolumeSpecName "kube-api-access-rsz2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.581672 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870f7b12-4944-4889-92bc-17f413d6ab36" (UID: "870f7b12-4944-4889-92bc-17f413d6ab36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.584132 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data" (OuterVolumeSpecName: "config-data") pod "870f7b12-4944-4889-92bc-17f413d6ab36" (UID: "870f7b12-4944-4889-92bc-17f413d6ab36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.599913 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d9lhn"
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.601665 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d9lhn" event={"ID":"870f7b12-4944-4889-92bc-17f413d6ab36","Type":"ContainerDied","Data":"9123970a303c814c5635406df8ee607e2b2504f12e788a9448e602bf3db286dd"}
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.601701 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9123970a303c814c5635406df8ee607e2b2504f12e788a9448e602bf3db286dd"
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.647798 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.648021 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.648099 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870f7b12-4944-4889-92bc-17f413d6ab36-logs\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.648156 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsz2n\" (UniqueName: \"kubernetes.io/projected/870f7b12-4944-4889-92bc-17f413d6ab36-kube-api-access-rsz2n\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.648215 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870f7b12-4944-4889-92bc-17f413d6ab36-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.843044 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gwg87"
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953030 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953114 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953146 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953187 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953251 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.953278 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8xrt\" (UniqueName: \"kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt\") pod \"9dc404db-3484-40e8-8241-35b197b3f120\" (UID: \"9dc404db-3484-40e8-8241-35b197b3f120\") "
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.956650 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.957059 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts" (OuterVolumeSpecName: "scripts") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.957091 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.958705 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt" (OuterVolumeSpecName: "kube-api-access-c8xrt") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "kube-api-access-c8xrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.973187 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data" (OuterVolumeSpecName: "config-data") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:41 crc kubenswrapper[4936]: I0930 13:57:41.973574 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dc404db-3484-40e8-8241-35b197b3f120" (UID: "9dc404db-3484-40e8-8241-35b197b3f120"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056109 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056147 4936 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056163 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056174 4936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056188 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc404db-3484-40e8-8241-35b197b3f120-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.056200 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8xrt\" (UniqueName: \"kubernetes.io/projected/9dc404db-3484-40e8-8241-35b197b3f120-kube-api-access-c8xrt\") on node \"crc\" DevicePath \"\""
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.500602 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-564967b568-8jhvh"]
Sep 30 13:57:42 crc kubenswrapper[4936]: E0930 13:57:42.501182 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc404db-3484-40e8-8241-35b197b3f120" containerName="keystone-bootstrap"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501197 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc404db-3484-40e8-8241-35b197b3f120" containerName="keystone-bootstrap"
Sep 30 13:57:42 crc kubenswrapper[4936]: E0930 13:57:42.501230 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501236 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns"
Sep 30 13:57:42 crc kubenswrapper[4936]: E0930 13:57:42.501248 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="init"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501254 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="init"
Sep 30 13:57:42 crc kubenswrapper[4936]: E0930 13:57:42.501264 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" containerName="placement-db-sync"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501270 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" containerName="placement-db-sync"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501437 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" containerName="placement-db-sync"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501452 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="183c122f-9990-4f38-b78f-6b70607064d6" containerName="dnsmasq-dns"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.501461 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc404db-3484-40e8-8241-35b197b3f120" containerName="keystone-bootstrap"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.502257 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.504547 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.504855 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.505087 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-t7snr"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.505284 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.505523 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.521394 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-564967b568-8jhvh"]
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.609036 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerStarted","Data":"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6"}
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.611688 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gwg87" event={"ID":"9dc404db-3484-40e8-8241-35b197b3f120","Type":"ContainerDied","Data":"69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6"}
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.611815 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f5018e24871b006b834398260111ad5ffa90100eff9a98a127b1c7367277a6"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.611830 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gwg87"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.664903 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-config-data\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.664976 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-public-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.665032 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-internal-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.665087 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-scripts\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.665119 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-combined-ca-bundle\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.665150 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471904cd-677c-4409-b641-15d34de36dbe-logs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.665183 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp8ml\" (UniqueName: \"kubernetes.io/projected/471904cd-677c-4409-b641-15d34de36dbe-kube-api-access-fp8ml\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.683250 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f8b6d55dd-xrxpc"]
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.712667 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f8b6d55dd-xrxpc"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.717917 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.717970 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.718212 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.718223 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.718428 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jszhj"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.718486 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.744188 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f8b6d55dd-xrxpc"]
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766523 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-scripts\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766575 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-combined-ca-bundle\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh"
Sep 30 13:57:42
crc kubenswrapper[4936]: I0930 13:57:42.766606 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471904cd-677c-4409-b641-15d34de36dbe-logs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766673 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp8ml\" (UniqueName: \"kubernetes.io/projected/471904cd-677c-4409-b641-15d34de36dbe-kube-api-access-fp8ml\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766725 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-config-data\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766755 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-public-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.766788 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-internal-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.768092 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471904cd-677c-4409-b641-15d34de36dbe-logs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.771479 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-scripts\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.773076 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-config-data\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.774120 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-public-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.782924 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-combined-ca-bundle\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.786775 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/471904cd-677c-4409-b641-15d34de36dbe-internal-tls-certs\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.788582 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp8ml\" (UniqueName: \"kubernetes.io/projected/471904cd-677c-4409-b641-15d34de36dbe-kube-api-access-fp8ml\") pod \"placement-564967b568-8jhvh\" (UID: \"471904cd-677c-4409-b641-15d34de36dbe\") " pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.818058 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.867693 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-internal-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.867890 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-public-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.867978 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-fernet-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 
13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.868070 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-scripts\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.868150 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-combined-ca-bundle\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.868222 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-credential-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.868316 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d88l\" (UniqueName: \"kubernetes.io/projected/57c8a1c9-ff09-4e19-90d2-e7552e497695-kube-api-access-6d88l\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.868454 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-config-data\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 
30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.973136 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-fernet-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974129 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-scripts\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974210 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-credential-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974225 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-combined-ca-bundle\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974577 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d88l\" (UniqueName: \"kubernetes.io/projected/57c8a1c9-ff09-4e19-90d2-e7552e497695-kube-api-access-6d88l\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974752 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-config-data\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974925 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-internal-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.974961 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-public-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.988825 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-credential-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.988973 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-combined-ca-bundle\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.991300 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-internal-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.991897 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-config-data\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.993066 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-public-tls-certs\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.993946 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-fernet-keys\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:42 crc kubenswrapper[4936]: I0930 13:57:42.995593 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c8a1c9-ff09-4e19-90d2-e7552e497695-scripts\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: \"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.009402 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d88l\" (UniqueName: \"kubernetes.io/projected/57c8a1c9-ff09-4e19-90d2-e7552e497695-kube-api-access-6d88l\") pod \"keystone-7f8b6d55dd-xrxpc\" (UID: 
\"57c8a1c9-ff09-4e19-90d2-e7552e497695\") " pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.033942 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.349092 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f8b6d55dd-xrxpc"] Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.371579 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-564967b568-8jhvh"] Sep 30 13:57:43 crc kubenswrapper[4936]: W0930 13:57:43.431040 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod471904cd_677c_4409_b641_15d34de36dbe.slice/crio-6f65f59e06e589bf1f431f306c9247add2a8e9e69a0a4cdfe59e2b1edadfba84 WatchSource:0}: Error finding container 6f65f59e06e589bf1f431f306c9247add2a8e9e69a0a4cdfe59e2b1edadfba84: Status 404 returned error can't find the container with id 6f65f59e06e589bf1f431f306c9247add2a8e9e69a0a4cdfe59e2b1edadfba84 Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.633782 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-564967b568-8jhvh" event={"ID":"471904cd-677c-4409-b641-15d34de36dbe","Type":"ContainerStarted","Data":"6f65f59e06e589bf1f431f306c9247add2a8e9e69a0a4cdfe59e2b1edadfba84"} Sep 30 13:57:43 crc kubenswrapper[4936]: I0930 13:57:43.634873 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f8b6d55dd-xrxpc" event={"ID":"57c8a1c9-ff09-4e19-90d2-e7552e497695","Type":"ContainerStarted","Data":"ff8755ce31f47f66f0770f03bb2dabfeb3681b927b8eb122e8c7c16caf0ac92a"} Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.647319 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-564967b568-8jhvh" 
event={"ID":"471904cd-677c-4409-b641-15d34de36dbe","Type":"ContainerStarted","Data":"36b026efdadf3ea3955220d9e490da3f1f90b2a263da47a744b0376e229b27aa"} Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.647656 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-564967b568-8jhvh" event={"ID":"471904cd-677c-4409-b641-15d34de36dbe","Type":"ContainerStarted","Data":"be69e623665185d5d6558ae1caa71e58b9470b3e18c16cfc4ec52d9c65f6edf5"} Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.647982 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.648008 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-564967b568-8jhvh" Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.655899 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f8b6d55dd-xrxpc" event={"ID":"57c8a1c9-ff09-4e19-90d2-e7552e497695","Type":"ContainerStarted","Data":"2222a601ad37b784e8dc8cf7a4fb644ecb9a512ad39c0598a463df3ff419fed3"} Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.655998 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.682773 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-564967b568-8jhvh" podStartSLOduration=2.682755915 podStartE2EDuration="2.682755915s" podCreationTimestamp="2025-09-30 13:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:44.671979321 +0000 UTC m=+1115.055981622" watchObservedRunningTime="2025-09-30 13:57:44.682755915 +0000 UTC m=+1115.066758216" Sep 30 13:57:44 crc kubenswrapper[4936]: I0930 13:57:44.699174 4936 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-7f8b6d55dd-xrxpc" podStartSLOduration=2.699156753 podStartE2EDuration="2.699156753s" podCreationTimestamp="2025-09-30 13:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:44.694458035 +0000 UTC m=+1115.078460336" watchObservedRunningTime="2025-09-30 13:57:44.699156753 +0000 UTC m=+1115.083159054" Sep 30 13:57:48 crc kubenswrapper[4936]: I0930 13:57:48.250653 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:57:48 crc kubenswrapper[4936]: I0930 13:57:48.250982 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.721876 4936 generic.go:334] "Generic (PLEG): container finished" podID="88e0e7bf-c7d6-4817-a3bc-77189570dfe6" containerID="484841fc4e2d47a267c56d0dc269d056bd589e6bf76c5d44edb85e78197a06c4" exitCode=0 Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.721968 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zrdqm" event={"ID":"88e0e7bf-c7d6-4817-a3bc-77189570dfe6","Type":"ContainerDied","Data":"484841fc4e2d47a267c56d0dc269d056bd589e6bf76c5d44edb85e78197a06c4"} Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.724383 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgmpr" 
event={"ID":"c1a55d69-0992-4eb2-a974-f4eedb0bf989","Type":"ContainerStarted","Data":"e0f1eeaedaf8b51b4b84376e35fc277329272693637da3a56a225ba83381af14"} Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.727856 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerStarted","Data":"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088"} Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.727978 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-central-agent" containerID="cri-o://37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a" gracePeriod=30 Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.728122 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.728164 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="proxy-httpd" containerID="cri-o://456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088" gracePeriod=30 Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.728202 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="sg-core" containerID="cri-o://c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6" gracePeriod=30 Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.728232 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-notification-agent" containerID="cri-o://56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada" 
gracePeriod=30 Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.733726 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4sjx" event={"ID":"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a","Type":"ContainerStarted","Data":"ac7828e1177620cec3c03e47abca450d452afe663f4eff809ef7c11404f88829"} Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.783651 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.346035407 podStartE2EDuration="1m0.783635374s" podCreationTimestamp="2025-09-30 13:56:53 +0000 UTC" firstStartedPulling="2025-09-30 13:56:55.435105525 +0000 UTC m=+1065.819107826" lastFinishedPulling="2025-09-30 13:57:52.872705492 +0000 UTC m=+1123.256707793" observedRunningTime="2025-09-30 13:57:53.766827805 +0000 UTC m=+1124.150830126" watchObservedRunningTime="2025-09-30 13:57:53.783635374 +0000 UTC m=+1124.167637675" Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.787420 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lgmpr" podStartSLOduration=3.051262208 podStartE2EDuration="1m0.787410787s" podCreationTimestamp="2025-09-30 13:56:53 +0000 UTC" firstStartedPulling="2025-09-30 13:56:55.118084799 +0000 UTC m=+1065.502087100" lastFinishedPulling="2025-09-30 13:57:52.854233378 +0000 UTC m=+1123.238235679" observedRunningTime="2025-09-30 13:57:53.782525684 +0000 UTC m=+1124.166527985" watchObservedRunningTime="2025-09-30 13:57:53.787410787 +0000 UTC m=+1124.171413088" Sep 30 13:57:53 crc kubenswrapper[4936]: I0930 13:57:53.805088 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q4sjx" podStartSLOduration=2.944268593 podStartE2EDuration="58.80507057s" podCreationTimestamp="2025-09-30 13:56:55 +0000 UTC" firstStartedPulling="2025-09-30 13:56:56.991863938 +0000 UTC m=+1067.375866229" lastFinishedPulling="2025-09-30 13:57:52.852665915 +0000 UTC 
m=+1123.236668206" observedRunningTime="2025-09-30 13:57:53.797469172 +0000 UTC m=+1124.181471503" watchObservedRunningTime="2025-09-30 13:57:53.80507057 +0000 UTC m=+1124.189072891" Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743012 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba837426-6b1f-4298-899e-44c286d74708" containerID="456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088" exitCode=0 Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743434 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba837426-6b1f-4298-899e-44c286d74708" containerID="c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6" exitCode=2 Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743078 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerDied","Data":"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088"} Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743483 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerDied","Data":"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6"} Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743498 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerDied","Data":"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a"} Sep 30 13:57:54 crc kubenswrapper[4936]: I0930 13:57:54.743448 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba837426-6b1f-4298-899e-44c286d74708" containerID="37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a" exitCode=0 Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.051620 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.191172 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle\") pod \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.191405 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config\") pod \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.191603 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5snt\" (UniqueName: \"kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt\") pod \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\" (UID: \"88e0e7bf-c7d6-4817-a3bc-77189570dfe6\") " Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.217949 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt" (OuterVolumeSpecName: "kube-api-access-d5snt") pod "88e0e7bf-c7d6-4817-a3bc-77189570dfe6" (UID: "88e0e7bf-c7d6-4817-a3bc-77189570dfe6"). InnerVolumeSpecName "kube-api-access-d5snt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.222689 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config" (OuterVolumeSpecName: "config") pod "88e0e7bf-c7d6-4817-a3bc-77189570dfe6" (UID: "88e0e7bf-c7d6-4817-a3bc-77189570dfe6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.222775 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88e0e7bf-c7d6-4817-a3bc-77189570dfe6" (UID: "88e0e7bf-c7d6-4817-a3bc-77189570dfe6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.293902 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.293938 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.293951 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5snt\" (UniqueName: \"kubernetes.io/projected/88e0e7bf-c7d6-4817-a3bc-77189570dfe6-kube-api-access-d5snt\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.752558 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zrdqm" event={"ID":"88e0e7bf-c7d6-4817-a3bc-77189570dfe6","Type":"ContainerDied","Data":"2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea"} Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.752602 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2847ae13895f8b1f9636e63c336ded5a8f4ad05e3dd46335bf0710731d8755ea" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.753639 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zrdqm" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.936238 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:57:55 crc kubenswrapper[4936]: E0930 13:57:55.936921 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0e7bf-c7d6-4817-a3bc-77189570dfe6" containerName="neutron-db-sync" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.936937 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0e7bf-c7d6-4817-a3bc-77189570dfe6" containerName="neutron-db-sync" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.937114 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e0e7bf-c7d6-4817-a3bc-77189570dfe6" containerName="neutron-db-sync" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.938188 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:55 crc kubenswrapper[4936]: I0930 13:57:55.971712 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.004293 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.004394 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvm5z\" (UniqueName: \"kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " 
pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.004417 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.004480 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.004655 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.105213 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106265 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106410 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvm5z\" (UniqueName: 
\"kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106446 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106502 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106559 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.106563 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.107907 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.107924 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.107954 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.108304 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.109288 
4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ft2gs" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.113738 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.113864 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.114032 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.136817 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.148795 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvm5z\" (UniqueName: \"kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z\") pod \"dnsmasq-dns-7b946d459c-brfsg\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") " pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.208729 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s7b9\" (UniqueName: \"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.209288 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc 
kubenswrapper[4936]: I0930 13:57:56.209536 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.209713 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.209884 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.256754 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.311174 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.311241 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.311293 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s7b9\" (UniqueName: \"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.311772 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.312409 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc 
kubenswrapper[4936]: I0930 13:57:56.321399 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.322670 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.329434 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.329796 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.347448 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s7b9\" (UniqueName: \"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9\") pod \"neutron-7dc6569b9d-8n4lf\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.430403 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.764869 4936 generic.go:334] "Generic (PLEG): container finished" podID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" containerID="ac7828e1177620cec3c03e47abca450d452afe663f4eff809ef7c11404f88829" exitCode=0 Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.765201 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4sjx" event={"ID":"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a","Type":"ContainerDied","Data":"ac7828e1177620cec3c03e47abca450d452afe663f4eff809ef7c11404f88829"} Sep 30 13:57:56 crc kubenswrapper[4936]: I0930 13:57:56.808108 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.000015 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.774015 4936 generic.go:334] "Generic (PLEG): container finished" podID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerID="e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1" exitCode=0 Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.774196 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" event={"ID":"377127b0-c46a-431d-8e2c-f902047dd4b3","Type":"ContainerDied","Data":"e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1"} Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.774401 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" event={"ID":"377127b0-c46a-431d-8e2c-f902047dd4b3","Type":"ContainerStarted","Data":"69acfcef5a4a9ce293ec51a5f5a2ac0e0bf33458bde05c621aa3f00dd5e45c01"} Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.777988 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" 
event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerStarted","Data":"73eb5ade2febe4faf83d4100fbbbad33237364a7f685850e870edceb1cf6a5f3"} Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.778022 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerStarted","Data":"57c1df51b386ffdad128fd5e625d06654ff54c7a1a075f96efc024b1350f3ae9"} Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.778032 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerStarted","Data":"760b0db3966ede9fd9846efd55b2bbfb35a6f8e0f959d3a8571854c0ccd77cf5"} Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.778062 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:57:57 crc kubenswrapper[4936]: I0930 13:57:57.876497 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7dc6569b9d-8n4lf" podStartSLOduration=1.8764762990000001 podStartE2EDuration="1.876476299s" podCreationTimestamp="2025-09-30 13:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:57.872225593 +0000 UTC m=+1128.256227894" watchObservedRunningTime="2025-09-30 13:57:57.876476299 +0000 UTC m=+1128.260478610" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.288938 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.457624 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8llsl\" (UniqueName: \"kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl\") pod \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.457707 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle\") pod \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.457824 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data\") pod \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\" (UID: \"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.464750 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" (UID: "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.491610 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl" (OuterVolumeSpecName: "kube-api-access-8llsl") pod "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" (UID: "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a"). 
InnerVolumeSpecName "kube-api-access-8llsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.509906 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" (UID: "f4b0ae70-9c0c-48af-8ad9-a226c9798c4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.559292 4936 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.559322 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8llsl\" (UniqueName: \"kubernetes.io/projected/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-kube-api-access-8llsl\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.559388 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.625142 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78b5b9486f-frfk9"] Sep 30 13:57:58 crc kubenswrapper[4936]: E0930 13:57:58.625530 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" containerName="barbican-db-sync" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.625541 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" containerName="barbican-db-sync" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.625754 
4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" containerName="barbican-db-sync" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.626666 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.632897 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.632919 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.642307 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78b5b9486f-frfk9"] Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.672714 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762174 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762398 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762429 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: 
\"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762452 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762475 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762501 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5zhk\" (UniqueName: \"kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762527 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts\") pod \"ba837426-6b1f-4298-899e-44c286d74708\" (UID: \"ba837426-6b1f-4298-899e-44c286d74708\") " Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762709 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-public-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762733 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-combined-ca-bundle\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762750 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-internal-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762815 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-httpd-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762832 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762827 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.762985 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-ovndb-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.763018 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxsm\" (UniqueName: \"kubernetes.io/projected/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-kube-api-access-7sxsm\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.763127 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.763203 4936 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.779168 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts" (OuterVolumeSpecName: "scripts") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.780934 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk" (OuterVolumeSpecName: "kube-api-access-w5zhk") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "kube-api-access-w5zhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.788219 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba837426-6b1f-4298-899e-44c286d74708" containerID="56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada" exitCode=0 Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.788376 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerDied","Data":"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada"} Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.788410 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba837426-6b1f-4298-899e-44c286d74708","Type":"ContainerDied","Data":"6a7cf218f977e38442a687e972c4ae617923d7961aa237baf9e02cf50b189267"} Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.788433 4936 scope.go:117] "RemoveContainer" containerID="456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.788558 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.795704 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.795967 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q4sjx" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.796200 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4sjx" event={"ID":"f4b0ae70-9c0c-48af-8ad9-a226c9798c4a","Type":"ContainerDied","Data":"ccef8cfe9330b048e63210cc44c60eb8ef6d7c9b6c8301b077bfbc35aa4cfed5"} Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.796247 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccef8cfe9330b048e63210cc44c60eb8ef6d7c9b6c8301b077bfbc35aa4cfed5" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.802284 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" event={"ID":"377127b0-c46a-431d-8e2c-f902047dd4b3","Type":"ContainerStarted","Data":"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967"} Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.802371 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.822040 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" podStartSLOduration=3.822023276 podStartE2EDuration="3.822023276s" podCreationTimestamp="2025-09-30 13:57:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:57:58.816260568 +0000 UTC m=+1129.200262859" watchObservedRunningTime="2025-09-30 13:57:58.822023276 +0000 UTC m=+1129.206025577" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.826562 4936 scope.go:117] "RemoveContainer" containerID="c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.856822 4936 scope.go:117] "RemoveContainer" containerID="56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865419 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-httpd-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865480 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865547 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-ovndb-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865578 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxsm\" (UniqueName: 
\"kubernetes.io/projected/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-kube-api-access-7sxsm\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865642 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-public-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865666 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-combined-ca-bundle\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-internal-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865773 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865793 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba837426-6b1f-4298-899e-44c286d74708-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865805 4936 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w5zhk\" (UniqueName: \"kubernetes.io/projected/ba837426-6b1f-4298-899e-44c286d74708-kube-api-access-w5zhk\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.865818 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.869584 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-internal-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.874084 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-httpd-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.874621 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-public-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.874848 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-combined-ca-bundle\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.875102 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-config\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.878399 4936 scope.go:117] "RemoveContainer" containerID="37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.889265 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.891133 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-ovndb-tls-certs\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.895057 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxsm\" (UniqueName: \"kubernetes.io/projected/9aa1f4b8-b399-4cf0-8d95-12a1eca674a7-kube-api-access-7sxsm\") pod \"neutron-78b5b9486f-frfk9\" (UID: \"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7\") " pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.928547 4936 scope.go:117] "RemoveContainer" containerID="456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088" Sep 30 13:57:58 crc kubenswrapper[4936]: E0930 13:57:58.938861 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088\": container with ID starting with 456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088 not found: ID does not exist" containerID="456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.938908 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088"} err="failed to get container status \"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088\": rpc error: code = NotFound desc = could not find container \"456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088\": container with ID starting with 456a845ee1c0aceb7e497fbfcde966cc9401cee9cd99800d394a80ca766b4088 not found: ID does not exist" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.938933 4936 scope.go:117] "RemoveContainer" containerID="c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6" Sep 30 13:57:58 crc kubenswrapper[4936]: E0930 13:57:58.939587 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6\": container with ID starting with c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6 not found: ID does not exist" containerID="c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.939622 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6"} err="failed to get container status \"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6\": rpc error: code = NotFound desc = could not find container 
\"c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6\": container with ID starting with c4d5a5a03569aa07b9f104786803c05b629039fc4598e9437b203ba2266cdff6 not found: ID does not exist" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.939635 4936 scope.go:117] "RemoveContainer" containerID="56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada" Sep 30 13:57:58 crc kubenswrapper[4936]: E0930 13:57:58.939943 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada\": container with ID starting with 56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada not found: ID does not exist" containerID="56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.939963 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada"} err="failed to get container status \"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada\": rpc error: code = NotFound desc = could not find container \"56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada\": container with ID starting with 56a7ec7de306be8cf8af9522b1a952ec3b355e6f27ae61c987a24c26e37e8ada not found: ID does not exist" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.939975 4936 scope.go:117] "RemoveContainer" containerID="37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a" Sep 30 13:57:58 crc kubenswrapper[4936]: E0930 13:57:58.940236 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a\": container with ID starting with 37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a not found: ID does not exist" 
containerID="37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.940268 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a"} err="failed to get container status \"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a\": rpc error: code = NotFound desc = could not find container \"37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a\": container with ID starting with 37ba9804ac456bbddf58ef86b0306ed5b782283f0af76b031081b8742748470a not found: ID does not exist" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.956786 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data" (OuterVolumeSpecName: "config-data") pod "ba837426-6b1f-4298-899e-44c286d74708" (UID: "ba837426-6b1f-4298-899e-44c286d74708"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.959302 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.967828 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:58 crc kubenswrapper[4936]: I0930 13:57:58.967857 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba837426-6b1f-4298-899e-44c286d74708-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023249 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-cf994c9f9-p76xq"] Sep 30 13:57:59 crc kubenswrapper[4936]: E0930 13:57:59.023642 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="proxy-httpd" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023665 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="proxy-httpd" Sep 30 13:57:59 crc kubenswrapper[4936]: E0930 13:57:59.023691 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="sg-core" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023698 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="sg-core" Sep 30 13:57:59 crc kubenswrapper[4936]: E0930 13:57:59.023706 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-notification-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023712 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-notification-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: 
E0930 13:57:59.023729 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-central-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023734 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-central-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023888 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-central-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023916 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="proxy-httpd" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023923 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="ceilometer-notification-agent" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.023934 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba837426-6b1f-4298-899e-44c286d74708" containerName="sg-core" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.027426 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.034115 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.034519 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-x72m7" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.034907 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.079725 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cf994c9f9-p76xq"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.133432 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-77c7874cdd-k7vml"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.134925 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.147015 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77c7874cdd-k7vml"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.158647 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.175246 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data-custom\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.175301 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c089d3fc-0428-4e46-8796-efa4f3df1fb6-logs\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.175326 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-combined-ca-bundle\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.184013 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4xp\" (UniqueName: \"kubernetes.io/projected/c089d3fc-0428-4e46-8796-efa4f3df1fb6-kube-api-access-xg4xp\") pod 
\"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.184265 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.213954 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.226362 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.239792 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.256549 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.259158 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.273152 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.273264 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.279013 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.281885 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285599 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-combined-ca-bundle\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285635 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5ee5ab-2208-44b0-a464-f813f6314c26-logs\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285662 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data-custom\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c089d3fc-0428-4e46-8796-efa4f3df1fb6-logs\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285710 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-combined-ca-bundle\") pod 
\"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4xp\" (UniqueName: \"kubernetes.io/projected/c089d3fc-0428-4e46-8796-efa4f3df1fb6-kube-api-access-xg4xp\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285774 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285803 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rqb\" (UniqueName: \"kubernetes.io/projected/6b5ee5ab-2208-44b0-a464-f813f6314c26-kube-api-access-j4rqb\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285848 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data-custom\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.285884 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.287069 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c089d3fc-0428-4e46-8796-efa4f3df1fb6-logs\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.293601 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data-custom\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.294901 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-combined-ca-bundle\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.302595 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c089d3fc-0428-4e46-8796-efa4f3df1fb6-config-data\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.309937 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 
13:57:59.326007 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"] Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.369504 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4xp\" (UniqueName: \"kubernetes.io/projected/c089d3fc-0428-4e46-8796-efa4f3df1fb6-kube-api-access-xg4xp\") pod \"barbican-worker-cf994c9f9-p76xq\" (UID: \"c089d3fc-0428-4e46-8796-efa4f3df1fb6\") " pod="openstack/barbican-worker-cf994c9f9-p76xq" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411417 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411722 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411744 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0" Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411774 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rqb\" (UniqueName: \"kubernetes.io/projected/6b5ee5ab-2208-44b0-a464-f813f6314c26-kube-api-access-j4rqb\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" 
(UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411806 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411859 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411895 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data-custom\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411933 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dzq\" (UniqueName: \"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411958 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.411987 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412010 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412049 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-combined-ca-bundle\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412070 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5ee5ab-2208-44b0-a464-f813f6314c26-logs\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412127 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqb46\" (UniqueName: \"kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412173 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412192 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.412217 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.417792 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b5ee5ab-2208-44b0-a464-f813f6314c26-logs\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.469442 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data-custom\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.474055 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-combined-ca-bundle\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.501215 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b5ee5ab-2208-44b0-a464-f813f6314c26-config-data\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.508054 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rqb\" (UniqueName: \"kubernetes.io/projected/6b5ee5ab-2208-44b0-a464-f813f6314c26-kube-api-access-j4rqb\") pod \"barbican-keystone-listener-77c7874cdd-k7vml\" (UID: \"6b5ee5ab-2208-44b0-a464-f813f6314c26\") " pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519499 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519552 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519584 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519621 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519707 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519771 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dzq\" (UniqueName: \"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519810 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519840 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519922 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqb46\" (UniqueName: \"kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.519997 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.520021 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.520053 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.529603 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.530292 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.533290 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.533607 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.534519 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.539598 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.540803 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.555705 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"]
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.560397 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dzq\" (UniqueName: \"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq\") pod \"dnsmasq-dns-6bb684768f-nrzqt\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.561793 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.575495 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.582968 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.583755 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.585254 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.586556 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqb46\" (UniqueName: \"kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.591494 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts\") pod \"ceilometer-0\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.660241 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cf994c9f9-p76xq"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.662909 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"]
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.754226 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.754288 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.754358 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.754393 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.754463 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwb6k\" (UniqueName: \"kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.805726 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.855580 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwb6k\" (UniqueName: \"kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.855643 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.855686 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.855725 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.855748 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.856940 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.862020 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.863468 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.863606 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.886511 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwb6k\" (UniqueName: \"kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k\") pod \"barbican-api-56f56f5fc4-snznt\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.890476 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.927090 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:57:59 crc kubenswrapper[4936]: I0930 13:57:59.974432 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78b5b9486f-frfk9"]
Sep 30 13:58:00 crc kubenswrapper[4936]: W0930 13:58:00.017481 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa1f4b8_b399_4cf0_8d95_12a1eca674a7.slice/crio-c80564e07b5e4686688750871fe3eeaf5ec0f2a8aa5156cc0f481c32af2b55f0 WatchSource:0}: Error finding container c80564e07b5e4686688750871fe3eeaf5ec0f2a8aa5156cc0f481c32af2b55f0: Status 404 returned error can't find the container with id c80564e07b5e4686688750871fe3eeaf5ec0f2a8aa5156cc0f481c32af2b55f0
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.256853 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77c7874cdd-k7vml"]
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.361321 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba837426-6b1f-4298-899e-44c286d74708" path="/var/lib/kubelet/pods/ba837426-6b1f-4298-899e-44c286d74708/volumes"
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.362543 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cf994c9f9-p76xq"]
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.471219 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"]
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.642632 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"]
Sep 30 13:58:00 crc kubenswrapper[4936]: W0930 13:58:00.663408 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63566983_fa75_492f_863c_862e222bbb67.slice/crio-224e07695121b6d5f944934db2b9edf17424712fc188b046c3b86650d472722a WatchSource:0}: Error finding container 224e07695121b6d5f944934db2b9edf17424712fc188b046c3b86650d472722a: Status 404 returned error can't find the container with id 224e07695121b6d5f944934db2b9edf17424712fc188b046c3b86650d472722a
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.678905 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.839447 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerStarted","Data":"2e6c6480f56ac613d8780e80e514445db9ebd5eabef4b18bacb620edddf2fe0e"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.841969 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" event={"ID":"6b5ee5ab-2208-44b0-a464-f813f6314c26","Type":"ContainerStarted","Data":"46d9fee444954b5ca6f53f4b3d8a04bc719b28ff63e463044c93935c24ce9f15"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.854668 4936 generic.go:334] "Generic (PLEG): container finished" podID="c1a55d69-0992-4eb2-a974-f4eedb0bf989" containerID="e0f1eeaedaf8b51b4b84376e35fc277329272693637da3a56a225ba83381af14" exitCode=0
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.854728 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgmpr" event={"ID":"c1a55d69-0992-4eb2-a974-f4eedb0bf989","Type":"ContainerDied","Data":"e0f1eeaedaf8b51b4b84376e35fc277329272693637da3a56a225ba83381af14"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.862848 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b5b9486f-frfk9" event={"ID":"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7","Type":"ContainerStarted","Data":"73b2b94e182a655c2753b7a1fb4fd3722fbd6aa77816658ed0c5f795a5740ffa"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.863095 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b5b9486f-frfk9" event={"ID":"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7","Type":"ContainerStarted","Data":"d24bdca3fcb677773304fc900b5eef83acbd2c4400d8fbef4a3d37621856ed94"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.863166 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b5b9486f-frfk9" event={"ID":"9aa1f4b8-b399-4cf0-8d95-12a1eca674a7","Type":"ContainerStarted","Data":"c80564e07b5e4686688750871fe3eeaf5ec0f2a8aa5156cc0f481c32af2b55f0"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.864035 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78b5b9486f-frfk9"
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.868257 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf994c9f9-p76xq" event={"ID":"c089d3fc-0428-4e46-8796-efa4f3df1fb6","Type":"ContainerStarted","Data":"9d46d5d83b6ddfe0cf2e805507b5c6c836715889cecdaa82b684b91546f2a148"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.874024 4936 generic.go:334] "Generic (PLEG): container finished" podID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerID="729ef6a2bd04a1ccf44013d6560782b7c0d578c8f70f2c9099b2970361c9752c" exitCode=0
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.874107 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" event={"ID":"b40c969f-6127-4b8c-ab4a-0219e01329a4","Type":"ContainerDied","Data":"729ef6a2bd04a1ccf44013d6560782b7c0d578c8f70f2c9099b2970361c9752c"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.874132 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" event={"ID":"b40c969f-6127-4b8c-ab4a-0219e01329a4","Type":"ContainerStarted","Data":"7f28b85c8953a676a47dd03c4752dd19bb09169510a42b9d37b9dd1d66890886"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.890536 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerStarted","Data":"224e07695121b6d5f944934db2b9edf17424712fc188b046c3b86650d472722a"}
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.890818 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="dnsmasq-dns" containerID="cri-o://fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967" gracePeriod=10
Sep 30 13:58:00 crc kubenswrapper[4936]: I0930 13:58:00.906528 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78b5b9486f-frfk9" podStartSLOduration=2.906511343 podStartE2EDuration="2.906511343s" podCreationTimestamp="2025-09-30 13:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:00.896181001 +0000 UTC m=+1131.280183302" watchObservedRunningTime="2025-09-30 13:58:00.906511343 +0000 UTC m=+1131.290513644"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.613479 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-brfsg"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.718850 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config\") pod \"377127b0-c46a-431d-8e2c-f902047dd4b3\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") "
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.718911 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb\") pod \"377127b0-c46a-431d-8e2c-f902047dd4b3\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") "
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.718941 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvm5z\" (UniqueName: \"kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z\") pod \"377127b0-c46a-431d-8e2c-f902047dd4b3\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") "
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.719063 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb\") pod \"377127b0-c46a-431d-8e2c-f902047dd4b3\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") "
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.719148 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc\") pod \"377127b0-c46a-431d-8e2c-f902047dd4b3\" (UID: \"377127b0-c46a-431d-8e2c-f902047dd4b3\") "
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.741579 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z" (OuterVolumeSpecName: "kube-api-access-jvm5z") pod "377127b0-c46a-431d-8e2c-f902047dd4b3" (UID: "377127b0-c46a-431d-8e2c-f902047dd4b3"). InnerVolumeSpecName "kube-api-access-jvm5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.812755 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "377127b0-c46a-431d-8e2c-f902047dd4b3" (UID: "377127b0-c46a-431d-8e2c-f902047dd4b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.822734 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvm5z\" (UniqueName: \"kubernetes.io/projected/377127b0-c46a-431d-8e2c-f902047dd4b3-kube-api-access-jvm5z\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.822761 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.877770 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "377127b0-c46a-431d-8e2c-f902047dd4b3" (UID: "377127b0-c46a-431d-8e2c-f902047dd4b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.882890 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "377127b0-c46a-431d-8e2c-f902047dd4b3" (UID: "377127b0-c46a-431d-8e2c-f902047dd4b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.894718 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config" (OuterVolumeSpecName: "config") pod "377127b0-c46a-431d-8e2c-f902047dd4b3" (UID: "377127b0-c46a-431d-8e2c-f902047dd4b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.937392 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.937649 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.937660 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377127b0-c46a-431d-8e2c-f902047dd4b3-config\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.957072 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" event={"ID":"b40c969f-6127-4b8c-ab4a-0219e01329a4","Type":"ContainerStarted","Data":"04e5e9242b738df4b426eadc9e35e2b53c51a95c856e6d7decc4673fc7c46c56"}
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.957413 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.967429 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerStarted","Data":"1f84acfa121cac1d033126aa96a946b6103b7bc8c4f0eff142b379c80fafc53b"}
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.967514 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerStarted","Data":"427d28070ec8d01eb7af421d3a7d281dafce4866ea0f3dfbabe37b039e9b1054"}
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.967555 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.967648 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56f56f5fc4-snznt"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.984464 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" podStartSLOduration=2.984440547 podStartE2EDuration="2.984440547s" podCreationTimestamp="2025-09-30 13:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:01.977858017 +0000 UTC m=+1132.361860308" watchObservedRunningTime="2025-09-30 13:58:01.984440547 +0000 UTC m=+1132.368442848"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.994028 4936 generic.go:334] "Generic (PLEG): container finished" podID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerID="fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967" exitCode=0
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.994083 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-brfsg"
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.994086 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" event={"ID":"377127b0-c46a-431d-8e2c-f902047dd4b3","Type":"ContainerDied","Data":"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967"}
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.994288 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-brfsg" event={"ID":"377127b0-c46a-431d-8e2c-f902047dd4b3","Type":"ContainerDied","Data":"69acfcef5a4a9ce293ec51a5f5a2ac0e0bf33458bde05c621aa3f00dd5e45c01"}
Sep 30 13:58:01 crc kubenswrapper[4936]: I0930 13:58:01.994312 4936 scope.go:117] "RemoveContainer" containerID="fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967"
Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.007186 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerStarted","Data":"acbc08ae19a83f60b9265e0da70c020236f78569dd381f65aada8da5f871106a"}
Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.062275 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56f56f5fc4-snznt" podStartSLOduration=3.062259423 podStartE2EDuration="3.062259423s" podCreationTimestamp="2025-09-30 13:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:02.008265558 +0000 UTC m=+1132.392267859" watchObservedRunningTime="2025-09-30 13:58:02.062259423 +0000 UTC m=+1132.446261724"
Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.068009 4936 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.079021 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-brfsg"] Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.333696 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" path="/var/lib/kubelet/pods/377127b0-c46a-431d-8e2c-f902047dd4b3/volumes" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.708347 4936 scope.go:117] "RemoveContainer" containerID="e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.753681 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75d6545796-72m9v"] Sep 30 13:58:02 crc kubenswrapper[4936]: E0930 13:58:02.754053 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="init" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.754077 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="init" Sep 30 13:58:02 crc kubenswrapper[4936]: E0930 13:58:02.754111 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="dnsmasq-dns" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.754119 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="dnsmasq-dns" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.754287 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="377127b0-c46a-431d-8e2c-f902047dd4b3" containerName="dnsmasq-dns" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.761356 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.763575 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.775882 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.786789 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75d6545796-72m9v"] Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.792930 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.873648 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-internal-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.873934 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2gr\" (UniqueName: \"kubernetes.io/projected/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-kube-api-access-nh2gr\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.873995 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-logs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 
13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.874028 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-combined-ca-bundle\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.874060 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.874098 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data-custom\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.874135 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-public-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.975783 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 
13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.975834 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.975926 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.975949 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.975965 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95xt4\" (UniqueName: \"kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976006 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data\") pod \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\" (UID: \"c1a55d69-0992-4eb2-a974-f4eedb0bf989\") " Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976116 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-logs\") 
pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976142 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-combined-ca-bundle\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976171 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976214 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data-custom\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976248 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-public-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976275 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-internal-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" 
(UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976299 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2gr\" (UniqueName: \"kubernetes.io/projected/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-kube-api-access-nh2gr\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.976637 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.978396 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-logs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.983296 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts" (OuterVolumeSpecName: "scripts") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.983429 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data-custom\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.985102 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:02 crc kubenswrapper[4936]: I0930 13:58:02.985747 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-combined-ca-bundle\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.003795 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-config-data\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.004646 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-public-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " 
pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.014862 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2gr\" (UniqueName: \"kubernetes.io/projected/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-kube-api-access-nh2gr\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.022129 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4" (OuterVolumeSpecName: "kube-api-access-95xt4") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "kube-api-access-95xt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.032581 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgmpr" event={"ID":"c1a55d69-0992-4eb2-a974-f4eedb0bf989","Type":"ContainerDied","Data":"de784979dddc7abc39e8e4853863e1aafc0ac0fa4c876bede69de5e89b968ebe"} Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.032845 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lgmpr" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.033099 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de784979dddc7abc39e8e4853863e1aafc0ac0fa4c876bede69de5e89b968ebe" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.038720 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68de6cb4-15c5-4c0e-b924-c2fff7f03eaf-internal-tls-certs\") pod \"barbican-api-75d6545796-72m9v\" (UID: \"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf\") " pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.051299 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.070389 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data" (OuterVolumeSpecName: "config-data") pod "c1a55d69-0992-4eb2-a974-f4eedb0bf989" (UID: "c1a55d69-0992-4eb2-a974-f4eedb0bf989"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080069 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c1a55d69-0992-4eb2-a974-f4eedb0bf989-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080103 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080112 4936 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080121 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080131 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95xt4\" (UniqueName: \"kubernetes.io/projected/c1a55d69-0992-4eb2-a974-f4eedb0bf989-kube-api-access-95xt4\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.080140 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1a55d69-0992-4eb2-a974-f4eedb0bf989-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.105419 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.142837 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:03 crc kubenswrapper[4936]: E0930 13:58:03.143268 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989" containerName="cinder-db-sync" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.143292 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989" containerName="cinder-db-sync" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.143530 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989" containerName="cinder-db-sync" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.144604 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.151077 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.176828 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283684 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283785 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283901 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283931 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283949 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vfw\" (UniqueName: \"kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.283964 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.326026 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"] Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.379219 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:58:03 crc 
kubenswrapper[4936]: I0930 13:58:03.380742 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387648 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387717 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387750 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387780 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vfw\" (UniqueName: \"kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387797 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " 
pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387822 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.387918 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.396941 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.401102 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.420962 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.424514 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.444030 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.465411 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.467146 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.472725 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.486068 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vfw\" (UniqueName: \"kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw\") pod \"cinder-scheduler-0\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.489441 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.489602 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzx68\" (UniqueName: \"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.489691 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.489785 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.489877 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.496308 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.496767 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593235 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593429 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593515 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593688 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f58b\" (UniqueName: \"kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.593781 
4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.596944 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzx68\" (UniqueName: \"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.597063 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.597188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.597269 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.597616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.597765 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.599093 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.599316 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.599321 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.599546 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc\") pod 
\"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.654524 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzx68\" (UniqueName: \"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68\") pod \"dnsmasq-dns-6d97fcdd8f-f26mj\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.699765 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.699957 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.700072 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.700133 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.700275 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.700321 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.700364 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f58b\" (UniqueName: \"kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.701976 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.702586 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.705424 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.707425 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.713809 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.715738 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.728118 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f58b\" (UniqueName: \"kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b\") pod \"cinder-api-0\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " pod="openstack/cinder-api-0" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.857127 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:03 crc kubenswrapper[4936]: I0930 13:58:03.876233 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:04 crc kubenswrapper[4936]: I0930 13:58:04.039897 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="dnsmasq-dns" containerID="cri-o://04e5e9242b738df4b426eadc9e35e2b53c51a95c856e6d7decc4673fc7c46c56" gracePeriod=10 Sep 30 13:58:04 crc kubenswrapper[4936]: I0930 13:58:04.747131 4936 scope.go:117] "RemoveContainer" containerID="fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967" Sep 30 13:58:04 crc kubenswrapper[4936]: E0930 13:58:04.747695 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967\": container with ID starting with fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967 not found: ID does not exist" containerID="fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967" Sep 30 13:58:04 crc kubenswrapper[4936]: I0930 13:58:04.747737 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967"} err="failed to get container status \"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967\": rpc error: code = NotFound desc = could not find container \"fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967\": container with ID starting with fb8ff52d6600198d4659ed1694a19c15f549333e8a55ded769264fed44738967 not found: ID does not exist" Sep 30 13:58:04 crc kubenswrapper[4936]: I0930 13:58:04.747775 4936 scope.go:117] "RemoveContainer" containerID="e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1" Sep 30 13:58:04 crc kubenswrapper[4936]: E0930 13:58:04.748180 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1\": container with ID starting with e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1 not found: ID does not exist" containerID="e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1" Sep 30 13:58:04 crc kubenswrapper[4936]: I0930 13:58:04.748202 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1"} err="failed to get container status \"e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1\": rpc error: code = NotFound desc = could not find container \"e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1\": container with ID starting with e82bd5103bf242d00b3e8adf069c3daae2ff638465741cc7d3600bfa03c56ed1 not found: ID does not exist" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.095877 4936 generic.go:334] "Generic (PLEG): container finished" podID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerID="04e5e9242b738df4b426eadc9e35e2b53c51a95c856e6d7decc4673fc7c46c56" exitCode=0 Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.096238 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" event={"ID":"b40c969f-6127-4b8c-ab4a-0219e01329a4","Type":"ContainerDied","Data":"04e5e9242b738df4b426eadc9e35e2b53c51a95c856e6d7decc4673fc7c46c56"} Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.146041 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.238911 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb\") pod \"b40c969f-6127-4b8c-ab4a-0219e01329a4\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.238980 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dzq\" (UniqueName: \"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq\") pod \"b40c969f-6127-4b8c-ab4a-0219e01329a4\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.239037 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb\") pod \"b40c969f-6127-4b8c-ab4a-0219e01329a4\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.239135 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config\") pod \"b40c969f-6127-4b8c-ab4a-0219e01329a4\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.239289 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc\") pod \"b40c969f-6127-4b8c-ab4a-0219e01329a4\" (UID: \"b40c969f-6127-4b8c-ab4a-0219e01329a4\") " Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.255573 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq" (OuterVolumeSpecName: "kube-api-access-t6dzq") pod "b40c969f-6127-4b8c-ab4a-0219e01329a4" (UID: "b40c969f-6127-4b8c-ab4a-0219e01329a4"). InnerVolumeSpecName "kube-api-access-t6dzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.342521 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6dzq\" (UniqueName: \"kubernetes.io/projected/b40c969f-6127-4b8c-ab4a-0219e01329a4-kube-api-access-t6dzq\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.371735 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config" (OuterVolumeSpecName: "config") pod "b40c969f-6127-4b8c-ab4a-0219e01329a4" (UID: "b40c969f-6127-4b8c-ab4a-0219e01329a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.377800 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b40c969f-6127-4b8c-ab4a-0219e01329a4" (UID: "b40c969f-6127-4b8c-ab4a-0219e01329a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.381890 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b40c969f-6127-4b8c-ab4a-0219e01329a4" (UID: "b40c969f-6127-4b8c-ab4a-0219e01329a4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.412074 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b40c969f-6127-4b8c-ab4a-0219e01329a4" (UID: "b40c969f-6127-4b8c-ab4a-0219e01329a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.431383 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.444492 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.444520 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.444529 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.444538 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40c969f-6127-4b8c-ab4a-0219e01329a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.573592 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75d6545796-72m9v"] Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.590465 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:05 crc kubenswrapper[4936]: I0930 13:58:05.864595 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:58:05 crc kubenswrapper[4936]: W0930 13:58:05.868780 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe31c877_0e46_4a03_b18d_773f9487573d.slice/crio-ecba3779c60bbcd65e4904c2b8e477ecdb53621add086f676870b27396068744 WatchSource:0}: Error finding container ecba3779c60bbcd65e4904c2b8e477ecdb53621add086f676870b27396068744: Status 404 returned error can't find the container with id ecba3779c60bbcd65e4904c2b8e477ecdb53621add086f676870b27396068744 Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.150740 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.161664 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerStarted","Data":"812cf1e7ce24a1fb47882a9012427fd14321aa91c24a8789bdf8940b89381ca8"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.173605 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" event={"ID":"6b5ee5ab-2208-44b0-a464-f813f6314c26","Type":"ContainerStarted","Data":"350e8cb4e53cbf3aa738ab3c6239fbb3b3facf8792b502f3f7555a57f7dcbaec"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.182006 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf994c9f9-p76xq" event={"ID":"c089d3fc-0428-4e46-8796-efa4f3df1fb6","Type":"ContainerStarted","Data":"985831bff7a7329ac235d38e15626556f1a6d188f8f0728a4a2de0059e628ab6"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.187449 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-75d6545796-72m9v" event={"ID":"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf","Type":"ContainerStarted","Data":"aa4bbec493f47d6b6339f55b9d114e3c30336e2bc8cd1841402676d7ac1e8157"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.187491 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75d6545796-72m9v" event={"ID":"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf","Type":"ContainerStarted","Data":"90771380c3499d98e9ed8fb8b3700f9243f019e4ae6c84b6eb834bff787fae14"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.189464 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerStarted","Data":"ffc2e0c79443e2870252788d15598b74f38812d8d6d9a24be099a02978adbb79"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.207864 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerStarted","Data":"be07b830a4829b79011313be7118fc6386e5cfeccd8e59330e2d7d6b70714236"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.227185 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.227449 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-nrzqt" event={"ID":"b40c969f-6127-4b8c-ab4a-0219e01329a4","Type":"ContainerDied","Data":"7f28b85c8953a676a47dd03c4752dd19bb09169510a42b9d37b9dd1d66890886"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.227515 4936 scope.go:117] "RemoveContainer" containerID="04e5e9242b738df4b426eadc9e35e2b53c51a95c856e6d7decc4673fc7c46c56" Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.253458 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" event={"ID":"fe31c877-0e46-4a03-b18d-773f9487573d","Type":"ContainerStarted","Data":"ecba3779c60bbcd65e4904c2b8e477ecdb53621add086f676870b27396068744"} Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.378287 4936 scope.go:117] "RemoveContainer" containerID="729ef6a2bd04a1ccf44013d6560782b7c0d578c8f70f2c9099b2970361c9752c" Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.391261 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"] Sep 30 13:58:06 crc kubenswrapper[4936]: I0930 13:58:06.401245 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-nrzqt"] Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.297329 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerStarted","Data":"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b"} Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.307286 4936 generic.go:334] "Generic (PLEG): container finished" podID="fe31c877-0e46-4a03-b18d-773f9487573d" containerID="ccccd5c293ea5698f4da4d7a6a763675657c1839a58d32a483144ddf1e731254" exitCode=0 Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 
13:58:07.307380 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" event={"ID":"fe31c877-0e46-4a03-b18d-773f9487573d","Type":"ContainerDied","Data":"ccccd5c293ea5698f4da4d7a6a763675657c1839a58d32a483144ddf1e731254"} Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.348148 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" event={"ID":"6b5ee5ab-2208-44b0-a464-f813f6314c26","Type":"ContainerStarted","Data":"1212c51e2fac2a7535e509ad2b10497efec6bfcb50f9b6084acd6c7d88f27baf"} Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.355676 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf994c9f9-p76xq" event={"ID":"c089d3fc-0428-4e46-8796-efa4f3df1fb6","Type":"ContainerStarted","Data":"8b9c885643fb4dcb4e71a19eff97d7cea2beab5566727b4d9aec7b3a25c77a19"} Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.370809 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75d6545796-72m9v" event={"ID":"68de6cb4-15c5-4c0e-b924-c2fff7f03eaf","Type":"ContainerStarted","Data":"cd273db653ec619ad62b20bfab6a6b85e8e404843914ace7d084819604758f37"} Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.371582 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.371693 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.380767 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-77c7874cdd-k7vml" podStartSLOduration=3.886768283 podStartE2EDuration="8.380744085s" podCreationTimestamp="2025-09-30 13:57:59 +0000 UTC" firstStartedPulling="2025-09-30 13:58:00.299985456 +0000 UTC m=+1130.683987757" 
lastFinishedPulling="2025-09-30 13:58:04.793961258 +0000 UTC m=+1135.177963559" observedRunningTime="2025-09-30 13:58:07.364477351 +0000 UTC m=+1137.748479652" watchObservedRunningTime="2025-09-30 13:58:07.380744085 +0000 UTC m=+1137.764746386" Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.388646 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-cf994c9f9-p76xq" podStartSLOduration=4.828589754 podStartE2EDuration="9.38862809s" podCreationTimestamp="2025-09-30 13:57:58 +0000 UTC" firstStartedPulling="2025-09-30 13:58:00.36344589 +0000 UTC m=+1130.747448191" lastFinishedPulling="2025-09-30 13:58:04.923484226 +0000 UTC m=+1135.307486527" observedRunningTime="2025-09-30 13:58:07.386246045 +0000 UTC m=+1137.770248366" watchObservedRunningTime="2025-09-30 13:58:07.38862809 +0000 UTC m=+1137.772630391" Sep 30 13:58:07 crc kubenswrapper[4936]: I0930 13:58:07.435281 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75d6545796-72m9v" podStartSLOduration=5.435261764 podStartE2EDuration="5.435261764s" podCreationTimestamp="2025-09-30 13:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:07.418169907 +0000 UTC m=+1137.802172208" watchObservedRunningTime="2025-09-30 13:58:07.435261764 +0000 UTC m=+1137.819264065" Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.327947 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" path="/var/lib/kubelet/pods/b40c969f-6127-4b8c-ab4a-0219e01329a4/volumes" Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.394619 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" 
event={"ID":"fe31c877-0e46-4a03-b18d-773f9487573d","Type":"ContainerStarted","Data":"26f9aa92815ecdda437d00df8e078ae0c2d085547d74710fbd35c9020cc63455"} Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.395799 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.404024 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerStarted","Data":"55c9ac4e3a68912741008db0585291dc6d0f7fb006db1a32bcb944baf9116b24"} Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.406380 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerStarted","Data":"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc"} Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.408445 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerStarted","Data":"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8"} Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.409825 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.408852 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" containerID="cri-o://57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8" gracePeriod=30 Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.408801 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api-log" 
containerID="cri-o://aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b" gracePeriod=30 Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.444353 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.444327087 podStartE2EDuration="5.444327087s" podCreationTimestamp="2025-09-30 13:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:08.441713745 +0000 UTC m=+1138.825716076" watchObservedRunningTime="2025-09-30 13:58:08.444327087 +0000 UTC m=+1138.828329378" Sep 30 13:58:08 crc kubenswrapper[4936]: I0930 13:58:08.449836 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" podStartSLOduration=5.449825157 podStartE2EDuration="5.449825157s" podCreationTimestamp="2025-09-30 13:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:08.420687131 +0000 UTC m=+1138.804689432" watchObservedRunningTime="2025-09-30 13:58:08.449825157 +0000 UTC m=+1138.833827458" Sep 30 13:58:09 crc kubenswrapper[4936]: I0930 13:58:09.437928 4936 generic.go:334] "Generic (PLEG): container finished" podID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerID="aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b" exitCode=143 Sep 30 13:58:09 crc kubenswrapper[4936]: I0930 13:58:09.438173 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerDied","Data":"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b"} Sep 30 13:58:09 crc kubenswrapper[4936]: I0930 13:58:09.447525 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerStarted","Data":"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b"} Sep 30 13:58:09 crc kubenswrapper[4936]: I0930 13:58:09.470863 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.3278509849999995 podStartE2EDuration="6.470839995s" podCreationTimestamp="2025-09-30 13:58:03 +0000 UTC" firstStartedPulling="2025-09-30 13:58:05.609637388 +0000 UTC m=+1135.993639689" lastFinishedPulling="2025-09-30 13:58:06.752626398 +0000 UTC m=+1137.136628699" observedRunningTime="2025-09-30 13:58:09.465305564 +0000 UTC m=+1139.849307865" watchObservedRunningTime="2025-09-30 13:58:09.470839995 +0000 UTC m=+1139.854842296" Sep 30 13:58:10 crc kubenswrapper[4936]: I0930 13:58:10.459836 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerStarted","Data":"9496281aa30011b1955836b8bd0c0d7b08fce2c19a234ec1af5aeaa3d789ed1f"} Sep 30 13:58:10 crc kubenswrapper[4936]: I0930 13:58:10.460245 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:58:12 crc kubenswrapper[4936]: I0930 13:58:12.043613 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f56f5fc4-snznt" Sep 30 13:58:12 crc kubenswrapper[4936]: I0930 13:58:12.067293 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.542262948 podStartE2EDuration="13.067270986s" podCreationTimestamp="2025-09-30 13:57:59 +0000 UTC" firstStartedPulling="2025-09-30 13:58:00.728676756 +0000 UTC m=+1131.112679047" lastFinishedPulling="2025-09-30 13:58:09.253684784 +0000 UTC m=+1139.637687085" observedRunningTime="2025-09-30 13:58:10.487660829 +0000 UTC m=+1140.871663130" watchObservedRunningTime="2025-09-30 13:58:12.067270986 
+0000 UTC m=+1142.451273287" Sep 30 13:58:12 crc kubenswrapper[4936]: I0930 13:58:12.122558 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56f56f5fc4-snznt" Sep 30 13:58:13 crc kubenswrapper[4936]: I0930 13:58:13.497788 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 13:58:13 crc kubenswrapper[4936]: I0930 13:58:13.859525 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:58:13 crc kubenswrapper[4936]: I0930 13:58:13.880010 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 13:58:13 crc kubenswrapper[4936]: I0930 13:58:13.910726 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:58:13 crc kubenswrapper[4936]: I0930 13:58:13.910977 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="dnsmasq-dns" containerID="cri-o://abf43715c04e37acc7230629354fa056b87f84678f8bd91800de34a00a2a92d7" gracePeriod=10 Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.559172 4936 generic.go:334] "Generic (PLEG): container finished" podID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerID="abf43715c04e37acc7230629354fa056b87f84678f8bd91800de34a00a2a92d7" exitCode=0 Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.560398 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" event={"ID":"3f3b6274-31d7-4826-b554-0e1eadc5a811","Type":"ContainerDied","Data":"abf43715c04e37acc7230629354fa056b87f84678f8bd91800de34a00a2a92d7"} Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.726171 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 
13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.796802 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.899369 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config\") pod \"3f3b6274-31d7-4826-b554-0e1eadc5a811\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.899431 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6n76\" (UniqueName: \"kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76\") pod \"3f3b6274-31d7-4826-b554-0e1eadc5a811\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.899541 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb\") pod \"3f3b6274-31d7-4826-b554-0e1eadc5a811\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.899570 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc\") pod \"3f3b6274-31d7-4826-b554-0e1eadc5a811\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.899595 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb\") pod \"3f3b6274-31d7-4826-b554-0e1eadc5a811\" (UID: \"3f3b6274-31d7-4826-b554-0e1eadc5a811\") " Sep 30 13:58:14 crc kubenswrapper[4936]: I0930 13:58:14.920571 
4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76" (OuterVolumeSpecName: "kube-api-access-t6n76") pod "3f3b6274-31d7-4826-b554-0e1eadc5a811" (UID: "3f3b6274-31d7-4826-b554-0e1eadc5a811"). InnerVolumeSpecName "kube-api-access-t6n76". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.001501 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6n76\" (UniqueName: \"kubernetes.io/projected/3f3b6274-31d7-4826-b554-0e1eadc5a811-kube-api-access-t6n76\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.002157 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f3b6274-31d7-4826-b554-0e1eadc5a811" (UID: "3f3b6274-31d7-4826-b554-0e1eadc5a811"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.017137 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f3b6274-31d7-4826-b554-0e1eadc5a811" (UID: "3f3b6274-31d7-4826-b554-0e1eadc5a811"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.023771 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f3b6274-31d7-4826-b554-0e1eadc5a811" (UID: "3f3b6274-31d7-4826-b554-0e1eadc5a811"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.030982 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config" (OuterVolumeSpecName: "config") pod "3f3b6274-31d7-4826-b554-0e1eadc5a811" (UID: "3f3b6274-31d7-4826-b554-0e1eadc5a811"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.103573 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.103611 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.103621 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.103630 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f3b6274-31d7-4826-b554-0e1eadc5a811-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.570719 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" event={"ID":"3f3b6274-31d7-4826-b554-0e1eadc5a811","Type":"ContainerDied","Data":"5db9680b4104135b4b0d34a22bd1a8b52846110977b2257e903585b1391cdf1f"} Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.570788 4936 scope.go:117] "RemoveContainer" 
containerID="abf43715c04e37acc7230629354fa056b87f84678f8bd91800de34a00a2a92d7" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.570923 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="cinder-scheduler" containerID="cri-o://e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc" gracePeriod=30 Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.570964 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="probe" containerID="cri-o://ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b" gracePeriod=30 Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.571382 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-9c8pv" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.616444 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.625519 4936 scope.go:117] "RemoveContainer" containerID="7c7eaf225b1c2939e3bf6b2201713b3ad9d07721bf6c46cdf1b7c9c3b61d33d3" Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.629319 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-9c8pv"] Sep 30 13:58:15 crc kubenswrapper[4936]: I0930 13:58:15.800194 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-564967b568-8jhvh" Sep 30 13:58:16 crc kubenswrapper[4936]: I0930 13:58:16.325592 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" path="/var/lib/kubelet/pods/3f3b6274-31d7-4826-b554-0e1eadc5a811/volumes" Sep 30 13:58:16 crc kubenswrapper[4936]: I0930 13:58:16.636196 4936 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/placement-564967b568-8jhvh" Sep 30 13:58:16 crc kubenswrapper[4936]: I0930 13:58:16.970702 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:17 crc kubenswrapper[4936]: I0930 13:58:17.163546 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:17 crc kubenswrapper[4936]: I0930 13:58:17.590104 4936 generic.go:334] "Generic (PLEG): container finished" podID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerID="ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b" exitCode=0 Sep 30 13:58:17 crc kubenswrapper[4936]: I0930 13:58:17.590144 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerDied","Data":"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b"} Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.115493 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75d6545796-72m9v" podUID="68de6cb4-15c5-4c0e-b924-c2fff7f03eaf" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.115994 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75d6545796-72m9v" 
podUID="68de6cb4-15c5-4c0e-b924-c2fff7f03eaf" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.252538 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.252590 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.362188 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497596 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vfw\" (UniqueName: \"kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497660 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497803 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497833 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497883 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497870 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.497943 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle\") pod \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\" (UID: \"5b8e4b0c-49a0-4865-8b29-03b95f99080b\") " Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.498409 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b8e4b0c-49a0-4865-8b29-03b95f99080b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.512637 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw" (OuterVolumeSpecName: "kube-api-access-f9vfw") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "kube-api-access-f9vfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.513387 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts" (OuterVolumeSpecName: "scripts") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.531503 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.600839 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vfw\" (UniqueName: \"kubernetes.io/projected/5b8e4b0c-49a0-4865-8b29-03b95f99080b-kube-api-access-f9vfw\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.600869 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.600879 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.605652 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.612898 4936 generic.go:334] "Generic (PLEG): container finished" podID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerID="e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc" exitCode=0 Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.612959 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerDied","Data":"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc"} Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.612989 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b8e4b0c-49a0-4865-8b29-03b95f99080b","Type":"ContainerDied","Data":"ffc2e0c79443e2870252788d15598b74f38812d8d6d9a24be099a02978adbb79"} Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.613030 4936 scope.go:117] "RemoveContainer" containerID="ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.613215 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.683939 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data" (OuterVolumeSpecName: "config-data") pod "5b8e4b0c-49a0-4865-8b29-03b95f99080b" (UID: "5b8e4b0c-49a0-4865-8b29-03b95f99080b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.700541 4936 scope.go:117] "RemoveContainer" containerID="e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.703498 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.703539 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b8e4b0c-49a0-4865-8b29-03b95f99080b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.727572 4936 scope.go:117] "RemoveContainer" containerID="ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b" Sep 30 13:58:18 crc kubenswrapper[4936]: E0930 13:58:18.728105 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b\": container with ID starting with ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b not found: ID does not exist" containerID="ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.728149 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b"} err="failed to get container status \"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b\": rpc error: code = NotFound desc = could not find container \"ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b\": container with ID starting with ffc735ee04dabfd1877bfa7270710e6ce4c26d411732d42b0740707a7ba9b82b not found: ID does not 
exist" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.728176 4936 scope.go:117] "RemoveContainer" containerID="e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc" Sep 30 13:58:18 crc kubenswrapper[4936]: E0930 13:58:18.728848 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc\": container with ID starting with e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc not found: ID does not exist" containerID="e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.728887 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc"} err="failed to get container status \"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc\": rpc error: code = NotFound desc = could not find container \"e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc\": container with ID starting with e322af9b5fbc5bc3a3637ab3ead036a399ad9dc32b50796e5ed884a49cf525fc not found: ID does not exist" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.771997 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f8b6d55dd-xrxpc" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.918562 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.153:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 13:58:18.971902 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:18 crc kubenswrapper[4936]: I0930 
13:58:18.987724 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060158 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060559 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="probe" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060577 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="probe" Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060585 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060591 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060611 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060617 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060629 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="init" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060635 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="init" Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060655 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="init" Sep 30 13:58:19 
crc kubenswrapper[4936]: I0930 13:58:19.060660 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="init" Sep 30 13:58:19 crc kubenswrapper[4936]: E0930 13:58:19.060672 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="cinder-scheduler" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060679 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="cinder-scheduler" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060853 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="cinder-scheduler" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060871 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40c969f-6127-4b8c-ab4a-0219e01329a4" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060883 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3b6274-31d7-4826-b554-0e1eadc5a811" containerName="dnsmasq-dns" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.060895 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" containerName="probe" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.061766 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.073995 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.098133 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.214852 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.214935 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkchn\" (UniqueName: \"kubernetes.io/projected/8dbca145-2e46-484f-9676-17bde0b6fe26-kube-api-access-pkchn\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.215158 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.215245 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dbca145-2e46-484f-9676-17bde0b6fe26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 
13:58:19.215658 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.215849 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.317938 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.317990 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.318023 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkchn\" (UniqueName: \"kubernetes.io/projected/8dbca145-2e46-484f-9676-17bde0b6fe26-kube-api-access-pkchn\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.318071 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.318149 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dbca145-2e46-484f-9676-17bde0b6fe26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.318271 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.318811 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dbca145-2e46-484f-9676-17bde0b6fe26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.321431 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.321674 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.322225 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-config-data\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.338842 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dbca145-2e46-484f-9676-17bde0b6fe26-scripts\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.356474 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.357722 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.370496 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.374901 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkchn\" (UniqueName: \"kubernetes.io/projected/8dbca145-2e46-484f-9676-17bde0b6fe26-kube-api-access-pkchn\") pod \"cinder-scheduler-0\" (UID: \"8dbca145-2e46-484f-9676-17bde0b6fe26\") " pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.375038 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.375355 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zp8f8" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.375356 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.381716 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.523491 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.523573 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.523672 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.523793 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv94d\" (UniqueName: \"kubernetes.io/projected/a0555978-f34e-4ada-9e39-513b4c199109-kube-api-access-kv94d\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.627968 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 
13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.628067 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv94d\" (UniqueName: \"kubernetes.io/projected/a0555978-f34e-4ada-9e39-513b4c199109-kube-api-access-kv94d\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.628173 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.628213 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.628793 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.636554 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.641814 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/a0555978-f34e-4ada-9e39-513b4c199109-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.652909 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv94d\" (UniqueName: \"kubernetes.io/projected/a0555978-f34e-4ada-9e39-513b4c199109-kube-api-access-kv94d\") pod \"openstackclient\" (UID: \"a0555978-f34e-4ada-9e39-513b4c199109\") " pod="openstack/openstackclient" Sep 30 13:58:19 crc kubenswrapper[4936]: I0930 13:58:19.822900 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 13:58:20 crc kubenswrapper[4936]: I0930 13:58:20.121150 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 13:58:20 crc kubenswrapper[4936]: W0930 13:58:20.145107 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dbca145_2e46_484f_9676_17bde0b6fe26.slice/crio-53ea340762c6915877b64b4168ffce85ab40dcb4b15a7d26dc89fe3a94014f76 WatchSource:0}: Error finding container 53ea340762c6915877b64b4168ffce85ab40dcb4b15a7d26dc89fe3a94014f76: Status 404 returned error can't find the container with id 53ea340762c6915877b64b4168ffce85ab40dcb4b15a7d26dc89fe3a94014f76 Sep 30 13:58:20 crc kubenswrapper[4936]: I0930 13:58:20.346007 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8e4b0c-49a0-4865-8b29-03b95f99080b" path="/var/lib/kubelet/pods/5b8e4b0c-49a0-4865-8b29-03b95f99080b/volumes" Sep 30 13:58:20 crc kubenswrapper[4936]: I0930 13:58:20.555785 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 13:58:20 crc kubenswrapper[4936]: I0930 13:58:20.658287 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"8dbca145-2e46-484f-9676-17bde0b6fe26","Type":"ContainerStarted","Data":"53ea340762c6915877b64b4168ffce85ab40dcb4b15a7d26dc89fe3a94014f76"} Sep 30 13:58:20 crc kubenswrapper[4936]: I0930 13:58:20.659833 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0555978-f34e-4ada-9e39-513b4c199109","Type":"ContainerStarted","Data":"28da3731f2ae4ad946f1bd65a4b14c0ce43a6f4f52502ec0f8642ba45ce451b1"} Sep 30 13:58:21 crc kubenswrapper[4936]: I0930 13:58:21.466558 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:21 crc kubenswrapper[4936]: I0930 13:58:21.682956 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8dbca145-2e46-484f-9676-17bde0b6fe26","Type":"ContainerStarted","Data":"a2f913f366fe00a3d6553e109e5f021ca949f3f75065929ee02b061c827b6a05"} Sep 30 13:58:21 crc kubenswrapper[4936]: I0930 13:58:21.976620 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-75d6545796-72m9v" podUID="68de6cb4-15c5-4c0e-b924-c2fff7f03eaf" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.700902 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8dbca145-2e46-484f-9676-17bde0b6fe26","Type":"ContainerStarted","Data":"3b63f4bfa1b410489070a5bef039c52e4eb6c5bd48e5ea3ecca59240d891c997"} Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.724583 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.724563188 podStartE2EDuration="4.724563188s" podCreationTimestamp="2025-09-30 13:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:22.718361499 +0000 UTC m=+1153.102363800" watchObservedRunningTime="2025-09-30 13:58:22.724563188 +0000 UTC m=+1153.108565489" Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.855371 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75d6545796-72m9v" Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.998591 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"] Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.998858 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api-log" containerID="cri-o://427d28070ec8d01eb7af421d3a7d281dafce4866ea0f3dfbabe37b039e9b1054" gracePeriod=30 Sep 30 13:58:22 crc kubenswrapper[4936]: I0930 13:58:22.999183 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" containerID="cri-o://1f84acfa121cac1d033126aa96a946b6103b7bc8c4f0eff142b379c80fafc53b" gracePeriod=30 Sep 30 13:58:23 crc kubenswrapper[4936]: I0930 13:58:23.713128 4936 generic.go:334] "Generic (PLEG): container finished" podID="63566983-fa75-492f-863c-862e222bbb67" containerID="427d28070ec8d01eb7af421d3a7d281dafce4866ea0f3dfbabe37b039e9b1054" exitCode=143 Sep 30 13:58:23 crc kubenswrapper[4936]: I0930 13:58:23.713312 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerDied","Data":"427d28070ec8d01eb7af421d3a7d281dafce4866ea0f3dfbabe37b039e9b1054"} Sep 30 13:58:23 crc kubenswrapper[4936]: I0930 13:58:23.959561 4936 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/cinder-api-0" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.153:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:24 crc kubenswrapper[4936]: I0930 13:58:24.382297 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.381555 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.451562 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.600810 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": read tcp 10.217.0.2:60724->10.217.0.149:9311: read: connection reset by peer" Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.605285 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56f56f5fc4-snznt" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.149:9311/healthcheck\": read tcp 10.217.0.2:60710->10.217.0.149:9311: read: connection reset by peer" Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.780405 4936 generic.go:334] "Generic (PLEG): container finished" podID="63566983-fa75-492f-863c-862e222bbb67" containerID="1f84acfa121cac1d033126aa96a946b6103b7bc8c4f0eff142b379c80fafc53b" exitCode=0 Sep 30 13:58:26 crc kubenswrapper[4936]: I0930 13:58:26.780714 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" 
event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerDied","Data":"1f84acfa121cac1d033126aa96a946b6103b7bc8c4f0eff142b379c80fafc53b"} Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.185638 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56f56f5fc4-snznt" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.313252 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs\") pod \"63566983-fa75-492f-863c-862e222bbb67\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.313301 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data\") pod \"63566983-fa75-492f-863c-862e222bbb67\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.313381 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom\") pod \"63566983-fa75-492f-863c-862e222bbb67\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.313447 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwb6k\" (UniqueName: \"kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k\") pod \"63566983-fa75-492f-863c-862e222bbb67\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.313539 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle\") pod \"63566983-fa75-492f-863c-862e222bbb67\" (UID: \"63566983-fa75-492f-863c-862e222bbb67\") " Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.314755 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs" (OuterVolumeSpecName: "logs") pod "63566983-fa75-492f-863c-862e222bbb67" (UID: "63566983-fa75-492f-863c-862e222bbb67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.320722 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k" (OuterVolumeSpecName: "kube-api-access-gwb6k") pod "63566983-fa75-492f-863c-862e222bbb67" (UID: "63566983-fa75-492f-863c-862e222bbb67"). InnerVolumeSpecName "kube-api-access-gwb6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.322695 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63566983-fa75-492f-863c-862e222bbb67" (UID: "63566983-fa75-492f-863c-862e222bbb67"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.415246 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63566983-fa75-492f-863c-862e222bbb67-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.415528 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.415542 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwb6k\" (UniqueName: \"kubernetes.io/projected/63566983-fa75-492f-863c-862e222bbb67-kube-api-access-gwb6k\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.434642 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63566983-fa75-492f-863c-862e222bbb67" (UID: "63566983-fa75-492f-863c-862e222bbb67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.477678 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data" (OuterVolumeSpecName: "config-data") pod "63566983-fa75-492f-863c-862e222bbb67" (UID: "63566983-fa75-492f-863c-862e222bbb67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.516700 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.516731 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63566983-fa75-492f-863c-862e222bbb67-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.821891 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56f56f5fc4-snznt" event={"ID":"63566983-fa75-492f-863c-862e222bbb67","Type":"ContainerDied","Data":"224e07695121b6d5f944934db2b9edf17424712fc188b046c3b86650d472722a"} Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.821923 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56f56f5fc4-snznt" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.821976 4936 scope.go:117] "RemoveContainer" containerID="1f84acfa121cac1d033126aa96a946b6103b7bc8c4f0eff142b379c80fafc53b" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.876699 4936 scope.go:117] "RemoveContainer" containerID="427d28070ec8d01eb7af421d3a7d281dafce4866ea0f3dfbabe37b039e9b1054" Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.889828 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"] Sep 30 13:58:27 crc kubenswrapper[4936]: I0930 13:58:27.898922 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56f56f5fc4-snznt"] Sep 30 13:58:28 crc kubenswrapper[4936]: I0930 13:58:28.327777 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63566983-fa75-492f-863c-862e222bbb67" path="/var/lib/kubelet/pods/63566983-fa75-492f-863c-862e222bbb67/volumes" Sep 30 13:58:28 crc kubenswrapper[4936]: I0930 13:58:28.983983 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78b5b9486f-frfk9" Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.144391 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.144617 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dc6569b9d-8n4lf" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-api" containerID="cri-o://57c1df51b386ffdad128fd5e625d06654ff54c7a1a075f96efc024b1350f3ae9" gracePeriod=30 Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.144986 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dc6569b9d-8n4lf" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-httpd" 
containerID="cri-o://73eb5ade2febe4faf83d4100fbbbad33237364a7f685850e870edceb1cf6a5f3" gracePeriod=30 Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.765234 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.848538 4936 generic.go:334] "Generic (PLEG): container finished" podID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerID="73eb5ade2febe4faf83d4100fbbbad33237364a7f685850e870edceb1cf6a5f3" exitCode=0 Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.848779 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerDied","Data":"73eb5ade2febe4faf83d4100fbbbad33237364a7f685850e870edceb1cf6a5f3"} Sep 30 13:58:29 crc kubenswrapper[4936]: I0930 13:58:29.905282 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 13:58:31 crc kubenswrapper[4936]: I0930 13:58:31.876287 4936 generic.go:334] "Generic (PLEG): container finished" podID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerID="57c1df51b386ffdad128fd5e625d06654ff54c7a1a075f96efc024b1350f3ae9" exitCode=0 Sep 30 13:58:31 crc kubenswrapper[4936]: I0930 13:58:31.876482 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerDied","Data":"57c1df51b386ffdad128fd5e625d06654ff54c7a1a075f96efc024b1350f3ae9"} Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.709743 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.822720 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs\") pod \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.823874 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s7b9\" (UniqueName: \"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9\") pod \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.824032 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle\") pod \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.824527 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config\") pod \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.824713 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config\") pod \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\" (UID: \"99fde8c0-65fb-426a-afd1-85f27f8e63ea\") " Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.828314 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9" (OuterVolumeSpecName: "kube-api-access-7s7b9") pod "99fde8c0-65fb-426a-afd1-85f27f8e63ea" (UID: "99fde8c0-65fb-426a-afd1-85f27f8e63ea"). InnerVolumeSpecName "kube-api-access-7s7b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.835892 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "99fde8c0-65fb-426a-afd1-85f27f8e63ea" (UID: "99fde8c0-65fb-426a-afd1-85f27f8e63ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.912120 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99fde8c0-65fb-426a-afd1-85f27f8e63ea" (UID: "99fde8c0-65fb-426a-afd1-85f27f8e63ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.913511 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config" (OuterVolumeSpecName: "config") pod "99fde8c0-65fb-426a-afd1-85f27f8e63ea" (UID: "99fde8c0-65fb-426a-afd1-85f27f8e63ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.915711 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "99fde8c0-65fb-426a-afd1-85f27f8e63ea" (UID: "99fde8c0-65fb-426a-afd1-85f27f8e63ea"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.928111 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dc6569b9d-8n4lf" event={"ID":"99fde8c0-65fb-426a-afd1-85f27f8e63ea","Type":"ContainerDied","Data":"760b0db3966ede9fd9846efd55b2bbfb35a6f8e0f959d3a8571854c0ccd77cf5"} Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.928216 4936 scope.go:117] "RemoveContainer" containerID="73eb5ade2febe4faf83d4100fbbbad33237364a7f685850e870edceb1cf6a5f3" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.928456 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dc6569b9d-8n4lf" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.931780 4936 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.931829 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s7b9\" (UniqueName: \"kubernetes.io/projected/99fde8c0-65fb-426a-afd1-85f27f8e63ea-kube-api-access-7s7b9\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.934604 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.934642 4936 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.934654 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/99fde8c0-65fb-426a-afd1-85f27f8e63ea-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.934762 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0555978-f34e-4ada-9e39-513b4c199109","Type":"ContainerStarted","Data":"674a90df9f0f000df0fe59027c22ca3f573aec6fc33d1f516bf249202f562f19"} Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.966406 4936 scope.go:117] "RemoveContainer" containerID="57c1df51b386ffdad128fd5e625d06654ff54c7a1a075f96efc024b1350f3ae9" Sep 30 13:58:35 crc kubenswrapper[4936]: I0930 13:58:35.970959 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.16652555 podStartE2EDuration="16.970936139s" podCreationTimestamp="2025-09-30 13:58:19 +0000 UTC" firstStartedPulling="2025-09-30 13:58:20.485407835 +0000 UTC m=+1150.869410136" lastFinishedPulling="2025-09-30 13:58:35.289818424 +0000 UTC m=+1165.673820725" observedRunningTime="2025-09-30 13:58:35.9625545 +0000 UTC m=+1166.346556821" watchObservedRunningTime="2025-09-30 13:58:35.970936139 +0000 UTC m=+1166.354938440" Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.000773 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.016158 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7dc6569b9d-8n4lf"] Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.294725 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.295376 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="proxy-httpd" containerID="cri-o://9496281aa30011b1955836b8bd0c0d7b08fce2c19a234ec1af5aeaa3d789ed1f" 
gracePeriod=30 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.295400 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-notification-agent" containerID="cri-o://812cf1e7ce24a1fb47882a9012427fd14321aa91c24a8789bdf8940b89381ca8" gracePeriod=30 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.295399 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="sg-core" containerID="cri-o://55c9ac4e3a68912741008db0585291dc6d0f7fb006db1a32bcb944baf9116b24" gracePeriod=30 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.295577 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-central-agent" containerID="cri-o://acbc08ae19a83f60b9265e0da70c020236f78569dd381f65aada8da5f871106a" gracePeriod=30 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.328798 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" path="/var/lib/kubelet/pods/99fde8c0-65fb-426a-afd1-85f27f8e63ea/volumes" Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.944798 4936 generic.go:334] "Generic (PLEG): container finished" podID="646cad35-c602-4015-b52f-9eca7cee1b80" containerID="9496281aa30011b1955836b8bd0c0d7b08fce2c19a234ec1af5aeaa3d789ed1f" exitCode=0 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.944831 4936 generic.go:334] "Generic (PLEG): container finished" podID="646cad35-c602-4015-b52f-9eca7cee1b80" containerID="55c9ac4e3a68912741008db0585291dc6d0f7fb006db1a32bcb944baf9116b24" exitCode=2 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.944838 4936 generic.go:334] "Generic (PLEG): container finished" podID="646cad35-c602-4015-b52f-9eca7cee1b80" 
containerID="acbc08ae19a83f60b9265e0da70c020236f78569dd381f65aada8da5f871106a" exitCode=0 Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.945642 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerDied","Data":"9496281aa30011b1955836b8bd0c0d7b08fce2c19a234ec1af5aeaa3d789ed1f"} Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.945667 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerDied","Data":"55c9ac4e3a68912741008db0585291dc6d0f7fb006db1a32bcb944baf9116b24"} Sep 30 13:58:36 crc kubenswrapper[4936]: I0930 13:58:36.945676 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerDied","Data":"acbc08ae19a83f60b9265e0da70c020236f78569dd381f65aada8da5f871106a"} Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.911460 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.960458 4936 generic.go:334] "Generic (PLEG): container finished" podID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerID="57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8" exitCode=137 Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.960502 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerDied","Data":"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8"} Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.960532 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2d41ee30-fe43-444a-abac-9d430d8fec9a","Type":"ContainerDied","Data":"be07b830a4829b79011313be7118fc6386e5cfeccd8e59330e2d7d6b70714236"} Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.960550 4936 scope.go:117] "RemoveContainer" containerID="57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8" Sep 30 13:58:38 crc kubenswrapper[4936]: I0930 13:58:38.960687 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.005397 4936 scope.go:117] "RemoveContainer" containerID="aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.038167 4936 scope.go:117] "RemoveContainer" containerID="57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.038560 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8\": container with ID starting with 57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8 not found: ID does not exist" containerID="57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.038593 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8"} err="failed to get container status \"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8\": rpc error: code = NotFound desc = could not find container \"57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8\": container with ID starting with 57e80b32cd6a03c3d68d8e0380a8a8d7a4290f52a00ecfde0c075500e73996a8 not found: ID does not exist" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.038643 4936 scope.go:117] "RemoveContainer" containerID="aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.039386 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b\": container with ID starting with 
aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b not found: ID does not exist" containerID="aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.039447 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b"} err="failed to get container status \"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b\": rpc error: code = NotFound desc = could not find container \"aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b\": container with ID starting with aee454167cde6c49f18835c493754516a4678bca1fa5dc722e55eb704bd8ff8b not found: ID does not exist" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094017 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094069 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094122 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094147 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094165 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f58b\" (UniqueName: \"kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094194 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094218 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094239 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom\") pod \"2d41ee30-fe43-444a-abac-9d430d8fec9a\" (UID: \"2d41ee30-fe43-444a-abac-9d430d8fec9a\") " Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.094923 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d41ee30-fe43-444a-abac-9d430d8fec9a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.095571 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs" (OuterVolumeSpecName: "logs") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.111125 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts" (OuterVolumeSpecName: "scripts") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.111214 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.123533 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b" (OuterVolumeSpecName: "kube-api-access-9f58b") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "kube-api-access-9f58b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.159505 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.173357 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data" (OuterVolumeSpecName: "config-data") pod "2d41ee30-fe43-444a-abac-9d430d8fec9a" (UID: "2d41ee30-fe43-444a-abac-9d430d8fec9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.200950 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.201071 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f58b\" (UniqueName: \"kubernetes.io/projected/2d41ee30-fe43-444a-abac-9d430d8fec9a-kube-api-access-9f58b\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.201092 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.201105 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d41ee30-fe43-444a-abac-9d430d8fec9a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.201116 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.201126 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d41ee30-fe43-444a-abac-9d430d8fec9a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.293351 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.301897 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.322634 
4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323012 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323030 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323040 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323046 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323063 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323069 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-api" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323080 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-httpd" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323085 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-httpd" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323111 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323118 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: E0930 13:58:39.323139 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323146 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323328 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-httpd" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323359 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323370 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="63566983-fa75-492f-863c-862e222bbb67" containerName="barbican-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323378 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api-log" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323385 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.323398 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fde8c0-65fb-426a-afd1-85f27f8e63ea" containerName="neutron-api" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.324270 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.327491 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.327701 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.327811 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.346563 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.507818 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-logs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508166 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8t9z\" (UniqueName: \"kubernetes.io/projected/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-kube-api-access-l8t9z\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508220 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508376 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508454 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data-custom\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508757 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508839 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.508928 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-scripts\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.509003 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data\") pod 
\"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610405 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610808 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610858 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-scripts\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610894 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610928 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-logs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.610997 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l8t9z\" (UniqueName: \"kubernetes.io/projected/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-kube-api-access-l8t9z\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.611025 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.611080 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.611124 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data-custom\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.611811 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.611883 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-logs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 
crc kubenswrapper[4936]: I0930 13:58:39.615943 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data-custom\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.618606 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-scripts\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.619448 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-config-data\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.619632 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.620295 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.637242 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8t9z\" (UniqueName: \"kubernetes.io/projected/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-kube-api-access-l8t9z\") pod \"cinder-api-0\" 
(UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.641499 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5b20b7-ae2f-4d19-9f5f-f4c4404868aa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa\") " pod="openstack/cinder-api-0" Sep 30 13:58:39 crc kubenswrapper[4936]: I0930 13:58:39.644379 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 13:58:40 crc kubenswrapper[4936]: I0930 13:58:40.081607 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 13:58:40 crc kubenswrapper[4936]: W0930 13:58:40.089154 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda5b20b7_ae2f_4d19_9f5f_f4c4404868aa.slice/crio-e36a95926727c3723e46965ad897e25f494ec66488229cd392eb8a20a9b1d115 WatchSource:0}: Error finding container e36a95926727c3723e46965ad897e25f494ec66488229cd392eb8a20a9b1d115: Status 404 returned error can't find the container with id e36a95926727c3723e46965ad897e25f494ec66488229cd392eb8a20a9b1d115 Sep 30 13:58:40 crc kubenswrapper[4936]: I0930 13:58:40.326449 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" path="/var/lib/kubelet/pods/2d41ee30-fe43-444a-abac-9d430d8fec9a/volumes" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.007879 4936 generic.go:334] "Generic (PLEG): container finished" podID="646cad35-c602-4015-b52f-9eca7cee1b80" containerID="812cf1e7ce24a1fb47882a9012427fd14321aa91c24a8789bdf8940b89381ca8" exitCode=0 Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.007959 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerDied","Data":"812cf1e7ce24a1fb47882a9012427fd14321aa91c24a8789bdf8940b89381ca8"} Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.031145 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-59tv8"] Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.038183 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.050024 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa","Type":"ContainerStarted","Data":"c0f95bb490403e7341120996cd4c1dc985e2dd7c40ddb9097d4ff5d9ca31943b"} Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.050100 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa","Type":"ContainerStarted","Data":"e36a95926727c3723e46965ad897e25f494ec66488229cd392eb8a20a9b1d115"} Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.054362 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-59tv8"] Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.143666 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv7tk\" (UniqueName: \"kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk\") pod \"nova-api-db-create-59tv8\" (UID: \"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3\") " pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.187310 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232051 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-d8pq5"] Sep 30 13:58:41 crc kubenswrapper[4936]: E0930 13:58:41.232377 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="sg-core" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232392 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="sg-core" Sep 30 13:58:41 crc kubenswrapper[4936]: E0930 13:58:41.232409 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-notification-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232416 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-notification-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: E0930 13:58:41.232434 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="proxy-httpd" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232440 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="proxy-httpd" Sep 30 13:58:41 crc kubenswrapper[4936]: E0930 13:58:41.232447 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-central-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232452 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-central-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232598 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" 
containerName="ceilometer-central-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232611 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="ceilometer-notification-agent" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232626 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="sg-core" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.232638 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" containerName="proxy-httpd" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.233121 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.245484 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv7tk\" (UniqueName: \"kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk\") pod \"nova-api-db-create-59tv8\" (UID: \"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3\") " pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.249515 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8pq5"] Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.278927 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv7tk\" (UniqueName: \"kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk\") pod \"nova-api-db-create-59tv8\" (UID: \"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3\") " pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.322028 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wb9vk"] Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.325818 
4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.338031 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wb9vk"] Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346384 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346461 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346537 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346627 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqb46\" (UniqueName: \"kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346655 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: 
\"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346676 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.346735 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml\") pod \"646cad35-c602-4015-b52f-9eca7cee1b80\" (UID: \"646cad35-c602-4015-b52f-9eca7cee1b80\") " Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.347013 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5slb\" (UniqueName: \"kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb\") pod \"nova-cell0-db-create-d8pq5\" (UID: \"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342\") " pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.347956 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.348062 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.372966 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46" (OuterVolumeSpecName: "kube-api-access-kqb46") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "kube-api-access-kqb46". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.373551 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts" (OuterVolumeSpecName: "scripts") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.401507 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.454631 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5slb\" (UniqueName: \"kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb\") pod \"nova-cell0-db-create-d8pq5\" (UID: \"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342\") " pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.454838 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78kk\" (UniqueName: \"kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk\") pod \"nova-cell1-db-create-wb9vk\" (UID: \"cbeaa979-4b15-4193-be6d-3691224ecb0c\") " pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.468261 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqb46\" (UniqueName: \"kubernetes.io/projected/646cad35-c602-4015-b52f-9eca7cee1b80-kube-api-access-kqb46\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.468357 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.468373 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.468383 4936 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/646cad35-c602-4015-b52f-9eca7cee1b80-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 
13:58:41.468396 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.478415 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.502870 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5slb\" (UniqueName: \"kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb\") pod \"nova-cell0-db-create-d8pq5\" (UID: \"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342\") " pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.550777 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.553488 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.570877 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j78kk\" (UniqueName: \"kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk\") pod \"nova-cell1-db-create-wb9vk\" (UID: \"cbeaa979-4b15-4193-be6d-3691224ecb0c\") " pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.571416 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.591403 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78kk\" (UniqueName: \"kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk\") pod \"nova-cell1-db-create-wb9vk\" (UID: \"cbeaa979-4b15-4193-be6d-3691224ecb0c\") " pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.598562 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data" (OuterVolumeSpecName: "config-data") pod "646cad35-c602-4015-b52f-9eca7cee1b80" (UID: "646cad35-c602-4015-b52f-9eca7cee1b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.675635 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646cad35-c602-4015-b52f-9eca7cee1b80-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:41 crc kubenswrapper[4936]: I0930 13:58:41.680149 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.086615 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-59tv8"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.093264 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"646cad35-c602-4015-b52f-9eca7cee1b80","Type":"ContainerDied","Data":"2e6c6480f56ac613d8780e80e514445db9ebd5eabef4b18bacb620edddf2fe0e"} Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.093327 4936 scope.go:117] "RemoveContainer" containerID="9496281aa30011b1955836b8bd0c0d7b08fce2c19a234ec1af5aeaa3d789ed1f" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.093555 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.151170 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.158229 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.215990 4936 scope.go:117] "RemoveContainer" containerID="55c9ac4e3a68912741008db0585291dc6d0f7fb006db1a32bcb944baf9116b24" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.238437 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.240669 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.248285 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.248659 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.257527 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.282006 4936 scope.go:117] "RemoveContainer" containerID="812cf1e7ce24a1fb47882a9012427fd14321aa91c24a8789bdf8940b89381ca8" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.289099 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wb9vk"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.295652 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8pq5"] Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.322126 4936 scope.go:117] "RemoveContainer" containerID="acbc08ae19a83f60b9265e0da70c020236f78569dd381f65aada8da5f871106a" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.333285 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646cad35-c602-4015-b52f-9eca7cee1b80" path="/var/lib/kubelet/pods/646cad35-c602-4015-b52f-9eca7cee1b80/volumes" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.399724 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.399780 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.399925 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.400014 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.400067 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbw7z\" (UniqueName: \"kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.400105 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.400123 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.502000 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbw7z\" (UniqueName: \"kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.502607 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.502753 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.503017 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.503758 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.503950 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.504113 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.503187 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.504684 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.510004 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.510359 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.510483 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.512989 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.537270 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbw7z\" (UniqueName: \"kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z\") pod \"ceilometer-0\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.574898 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:58:42 crc kubenswrapper[4936]: I0930 13:58:42.583317 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.136831 4936 generic.go:334] "Generic (PLEG): container finished" podID="f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" containerID="6d6ad0923bf9e3994acea602c1802c2f92683e20ba7a5872cf75577f89d9aa14" exitCode=0 Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.137119 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-59tv8" event={"ID":"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3","Type":"ContainerDied","Data":"6d6ad0923bf9e3994acea602c1802c2f92683e20ba7a5872cf75577f89d9aa14"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.137144 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-59tv8" event={"ID":"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3","Type":"ContainerStarted","Data":"03ac8176a92db88b7173aecb59083d5cca6a9bad9980b30f7022e98fd9a15b01"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.144790 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.156927 4936 generic.go:334] "Generic (PLEG): container finished" podID="4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" containerID="2591f592f6078e6eb57d59bfca4a11bc4b410e53954de87773a95924d1f0c8c6" exitCode=0 Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.157011 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8pq5" event={"ID":"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342","Type":"ContainerDied","Data":"2591f592f6078e6eb57d59bfca4a11bc4b410e53954de87773a95924d1f0c8c6"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.157038 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8pq5" 
event={"ID":"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342","Type":"ContainerStarted","Data":"ab9ca161f374356ea291775252af7b390cfa08f5894916c39ce3c986b7a0ac34"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.221159 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da5b20b7-ae2f-4d19-9f5f-f4c4404868aa","Type":"ContainerStarted","Data":"96406263ca7dfaf7ae37acd078a537dc62e5a29ccdd958b992c5760d207af9ed"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.221414 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.228661 4936 generic.go:334] "Generic (PLEG): container finished" podID="cbeaa979-4b15-4193-be6d-3691224ecb0c" containerID="9ebce1b3502f7a87b1305a833b007f180cd900cc76923316de984e091c10f761" exitCode=0 Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.228706 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb9vk" event={"ID":"cbeaa979-4b15-4193-be6d-3691224ecb0c","Type":"ContainerDied","Data":"9ebce1b3502f7a87b1305a833b007f180cd900cc76923316de984e091c10f761"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.228733 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb9vk" event={"ID":"cbeaa979-4b15-4193-be6d-3691224ecb0c","Type":"ContainerStarted","Data":"c417a05845a5b3c444df3610f7ad68723caa67a71528261dfdf4b95d269f2e49"} Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.238845 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.238829259 podStartE2EDuration="4.238829259s" podCreationTimestamp="2025-09-30 13:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:58:43.236942998 +0000 UTC m=+1173.620945299" watchObservedRunningTime="2025-09-30 
13:58:43.238829259 +0000 UTC m=+1173.622831550" Sep 30 13:58:43 crc kubenswrapper[4936]: I0930 13:58:43.877592 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2d41ee30-fe43-444a-abac-9d430d8fec9a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.153:8776/healthcheck\": dial tcp 10.217.0.153:8776: i/o timeout (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.239235 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerStarted","Data":"30c5926d839127feba6cadb17cc1dc9c4a968c79122640a8d9a323c58144fa83"} Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.242742 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerStarted","Data":"381b25a2c42774afd79067fed416fbd2f6274f6ce09a774287af059b1b35ddc2"} Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.655998 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.773524 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv7tk\" (UniqueName: \"kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk\") pod \"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3\" (UID: \"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3\") " Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.783562 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk" (OuterVolumeSpecName: "kube-api-access-mv7tk") pod "f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" (UID: "f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3"). InnerVolumeSpecName "kube-api-access-mv7tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.855483 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.861393 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.879862 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv7tk\" (UniqueName: \"kubernetes.io/projected/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3-kube-api-access-mv7tk\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.980925 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5slb\" (UniqueName: \"kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb\") pod \"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342\" (UID: \"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342\") " Sep 30 13:58:44 crc kubenswrapper[4936]: I0930 13:58:44.981165 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j78kk\" (UniqueName: \"kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk\") pod \"cbeaa979-4b15-4193-be6d-3691224ecb0c\" (UID: \"cbeaa979-4b15-4193-be6d-3691224ecb0c\") " Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.004465 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk" (OuterVolumeSpecName: "kube-api-access-j78kk") pod "cbeaa979-4b15-4193-be6d-3691224ecb0c" (UID: "cbeaa979-4b15-4193-be6d-3691224ecb0c"). InnerVolumeSpecName "kube-api-access-j78kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.004535 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb" (OuterVolumeSpecName: "kube-api-access-d5slb") pod "4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" (UID: "4c5e56b1-fb0d-411b-b8dc-3fe9605ac342"). InnerVolumeSpecName "kube-api-access-d5slb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.083637 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5slb\" (UniqueName: \"kubernetes.io/projected/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342-kube-api-access-d5slb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.083966 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j78kk\" (UniqueName: \"kubernetes.io/projected/cbeaa979-4b15-4193-be6d-3691224ecb0c-kube-api-access-j78kk\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.251357 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wb9vk" event={"ID":"cbeaa979-4b15-4193-be6d-3691224ecb0c","Type":"ContainerDied","Data":"c417a05845a5b3c444df3610f7ad68723caa67a71528261dfdf4b95d269f2e49"} Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.252490 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c417a05845a5b3c444df3610f7ad68723caa67a71528261dfdf4b95d269f2e49" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.251391 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wb9vk" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.254723 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerStarted","Data":"d02dcf361baedad07ba59068ee99b89588c40ef50c0f0482548af730002427dd"} Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.256665 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-59tv8" event={"ID":"f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3","Type":"ContainerDied","Data":"03ac8176a92db88b7173aecb59083d5cca6a9bad9980b30f7022e98fd9a15b01"} Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.256718 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03ac8176a92db88b7173aecb59083d5cca6a9bad9980b30f7022e98fd9a15b01" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.256680 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-59tv8" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.259449 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8pq5" event={"ID":"4c5e56b1-fb0d-411b-b8dc-3fe9605ac342","Type":"ContainerDied","Data":"ab9ca161f374356ea291775252af7b390cfa08f5894916c39ce3c986b7a0ac34"} Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.259486 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9ca161f374356ea291775252af7b390cfa08f5894916c39ce3c986b7a0ac34" Sep 30 13:58:45 crc kubenswrapper[4936]: I0930 13:58:45.259615 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-d8pq5" Sep 30 13:58:46 crc kubenswrapper[4936]: I0930 13:58:46.270298 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerStarted","Data":"d7434791e9fb0a14d082f4b3f76bc437072f31cebee29d9a2c27732a2a1d761d"} Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.250520 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.250796 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.250850 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.251643 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.251703 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95" gracePeriod=600 Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.293390 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerStarted","Data":"7549986bb6cb2efa19127a22ea3e64c5601cce35de4c4103f4136ddcc3771ffc"} Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.293648 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-central-agent" containerID="cri-o://30c5926d839127feba6cadb17cc1dc9c4a968c79122640a8d9a323c58144fa83" gracePeriod=30 Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.293980 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.294314 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="proxy-httpd" containerID="cri-o://7549986bb6cb2efa19127a22ea3e64c5601cce35de4c4103f4136ddcc3771ffc" gracePeriod=30 Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.294543 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="sg-core" containerID="cri-o://d7434791e9fb0a14d082f4b3f76bc437072f31cebee29d9a2c27732a2a1d761d" gracePeriod=30 Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.294592 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-notification-agent" 
containerID="cri-o://d02dcf361baedad07ba59068ee99b89588c40ef50c0f0482548af730002427dd" gracePeriod=30 Sep 30 13:58:48 crc kubenswrapper[4936]: I0930 13:58:48.328177 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.663678564 podStartE2EDuration="6.328148463s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="2025-09-30 13:58:43.180551517 +0000 UTC m=+1173.564553818" lastFinishedPulling="2025-09-30 13:58:47.845021416 +0000 UTC m=+1178.229023717" observedRunningTime="2025-09-30 13:58:48.319785044 +0000 UTC m=+1178.703787345" watchObservedRunningTime="2025-09-30 13:58:48.328148463 +0000 UTC m=+1178.712150774" Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.303191 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95" exitCode=0 Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.303684 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95"} Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.303712 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628"} Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.303728 4936 scope.go:117] "RemoveContainer" containerID="48ed87deccef46c180b6a2bcdda86faafafe3195aa273e064e63d95d1f7429e4" Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308181 4936 generic.go:334] "Generic (PLEG): container finished" podID="5442165b-1df8-41c0-9118-7573222c1c27" 
containerID="d7434791e9fb0a14d082f4b3f76bc437072f31cebee29d9a2c27732a2a1d761d" exitCode=2 Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308203 4936 generic.go:334] "Generic (PLEG): container finished" podID="5442165b-1df8-41c0-9118-7573222c1c27" containerID="d02dcf361baedad07ba59068ee99b89588c40ef50c0f0482548af730002427dd" exitCode=0 Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308212 4936 generic.go:334] "Generic (PLEG): container finished" podID="5442165b-1df8-41c0-9118-7573222c1c27" containerID="30c5926d839127feba6cadb17cc1dc9c4a968c79122640a8d9a323c58144fa83" exitCode=0 Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308226 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerDied","Data":"d7434791e9fb0a14d082f4b3f76bc437072f31cebee29d9a2c27732a2a1d761d"} Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308243 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerDied","Data":"d02dcf361baedad07ba59068ee99b89588c40ef50c0f0482548af730002427dd"} Sep 30 13:58:49 crc kubenswrapper[4936]: I0930 13:58:49.308252 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerDied","Data":"30c5926d839127feba6cadb17cc1dc9c4a968c79122640a8d9a323c58144fa83"} Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.162429 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4628-account-create-jdb2s"] Sep 30 13:58:51 crc kubenswrapper[4936]: E0930 13:58:51.163299 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeaa979-4b15-4193-be6d-3691224ecb0c" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163314 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cbeaa979-4b15-4193-be6d-3691224ecb0c" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: E0930 13:58:51.163361 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163370 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: E0930 13:58:51.163381 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163388 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163564 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163579 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbeaa979-4b15-4193-be6d-3691224ecb0c" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.163601 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" containerName="mariadb-database-create" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.164276 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.171226 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.192089 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4628-account-create-jdb2s"] Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.297069 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsbg\" (UniqueName: \"kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg\") pod \"nova-api-4628-account-create-jdb2s\" (UID: \"5e87f5ac-ee83-4ba4-b456-f378b93edd80\") " pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.364801 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-05e7-account-create-z5bwm"] Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.366213 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.367965 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.380039 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05e7-account-create-z5bwm"] Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.400655 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsbg\" (UniqueName: \"kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg\") pod \"nova-api-4628-account-create-jdb2s\" (UID: \"5e87f5ac-ee83-4ba4-b456-f378b93edd80\") " pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.434731 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsbg\" (UniqueName: \"kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg\") pod \"nova-api-4628-account-create-jdb2s\" (UID: \"5e87f5ac-ee83-4ba4-b456-f378b93edd80\") " pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.502226 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktzq\" (UniqueName: \"kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq\") pod \"nova-cell0-05e7-account-create-z5bwm\" (UID: \"1be0320c-fe8d-41de-bcf6-97dc4089ed39\") " pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.504398 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.585247 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5b87-account-create-zpj8s"] Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.596846 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5b87-account-create-zpj8s"] Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.596965 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.599435 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.610726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktzq\" (UniqueName: \"kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq\") pod \"nova-cell0-05e7-account-create-z5bwm\" (UID: \"1be0320c-fe8d-41de-bcf6-97dc4089ed39\") " pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.636347 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktzq\" (UniqueName: \"kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq\") pod \"nova-cell0-05e7-account-create-z5bwm\" (UID: \"1be0320c-fe8d-41de-bcf6-97dc4089ed39\") " pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.685932 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.714638 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfgd\" (UniqueName: \"kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd\") pod \"nova-cell1-5b87-account-create-zpj8s\" (UID: \"90caccbb-1dbf-4ba6-af11-3b21f95535dc\") " pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.816286 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfgd\" (UniqueName: \"kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd\") pod \"nova-cell1-5b87-account-create-zpj8s\" (UID: \"90caccbb-1dbf-4ba6-af11-3b21f95535dc\") " pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:51 crc kubenswrapper[4936]: I0930 13:58:51.862371 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfgd\" (UniqueName: \"kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd\") pod \"nova-cell1-5b87-account-create-zpj8s\" (UID: \"90caccbb-1dbf-4ba6-af11-3b21f95535dc\") " pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.083731 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.096859 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.226853 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4628-account-create-jdb2s"] Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.329867 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05e7-account-create-z5bwm"] Sep 30 13:58:52 crc kubenswrapper[4936]: W0930 13:58:52.352057 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1be0320c_fe8d_41de_bcf6_97dc4089ed39.slice/crio-25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c WatchSource:0}: Error finding container 25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c: Status 404 returned error can't find the container with id 25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.372480 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4628-account-create-jdb2s" event={"ID":"5e87f5ac-ee83-4ba4-b456-f378b93edd80","Type":"ContainerStarted","Data":"472d5c47600431ed68c5395f389d0ebb6ce6ac6a473b51efb0b1c59a53705f8a"} Sep 30 13:58:52 crc kubenswrapper[4936]: I0930 13:58:52.714514 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5b87-account-create-zpj8s"] Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.382730 4936 generic.go:334] "Generic (PLEG): container finished" podID="90caccbb-1dbf-4ba6-af11-3b21f95535dc" containerID="c994cfd6b7cc5e54c3b2e32b3e0a5a298013d72287be64e7aa3079d59492e5c2" exitCode=0 Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.382846 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5b87-account-create-zpj8s" 
event={"ID":"90caccbb-1dbf-4ba6-af11-3b21f95535dc","Type":"ContainerDied","Data":"c994cfd6b7cc5e54c3b2e32b3e0a5a298013d72287be64e7aa3079d59492e5c2"} Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.382891 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5b87-account-create-zpj8s" event={"ID":"90caccbb-1dbf-4ba6-af11-3b21f95535dc","Type":"ContainerStarted","Data":"fd70676b7f5f5c2180b64670c73aea9789cd7090a9107ff5eaf470e9aa3f45bd"} Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.384863 4936 generic.go:334] "Generic (PLEG): container finished" podID="1be0320c-fe8d-41de-bcf6-97dc4089ed39" containerID="d3bc471e5264a1d1e113b058ffbfc9c96f2b1748aa818ffac0207b57aad2b48d" exitCode=0 Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.384920 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05e7-account-create-z5bwm" event={"ID":"1be0320c-fe8d-41de-bcf6-97dc4089ed39","Type":"ContainerDied","Data":"d3bc471e5264a1d1e113b058ffbfc9c96f2b1748aa818ffac0207b57aad2b48d"} Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.384938 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05e7-account-create-z5bwm" event={"ID":"1be0320c-fe8d-41de-bcf6-97dc4089ed39","Type":"ContainerStarted","Data":"25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c"} Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.388219 4936 generic.go:334] "Generic (PLEG): container finished" podID="5e87f5ac-ee83-4ba4-b456-f378b93edd80" containerID="f18ba22d77c1eda91097c3c5d1ec71d70943e850aa81b62c66f87da7dbd174b1" exitCode=0 Sep 30 13:58:53 crc kubenswrapper[4936]: I0930 13:58:53.388244 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4628-account-create-jdb2s" event={"ID":"5e87f5ac-ee83-4ba4-b456-f378b93edd80","Type":"ContainerDied","Data":"f18ba22d77c1eda91097c3c5d1ec71d70943e850aa81b62c66f87da7dbd174b1"} Sep 30 13:58:54 crc kubenswrapper[4936]: I0930 
13:58:54.812220 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:54 crc kubenswrapper[4936]: I0930 13:58:54.927415 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:54 crc kubenswrapper[4936]: I0930 13:58:54.945053 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.001327 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfgd\" (UniqueName: \"kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd\") pod \"90caccbb-1dbf-4ba6-af11-3b21f95535dc\" (UID: \"90caccbb-1dbf-4ba6-af11-3b21f95535dc\") " Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.017130 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd" (OuterVolumeSpecName: "kube-api-access-kjfgd") pod "90caccbb-1dbf-4ba6-af11-3b21f95535dc" (UID: "90caccbb-1dbf-4ba6-af11-3b21f95535dc"). InnerVolumeSpecName "kube-api-access-kjfgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.103810 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsbg\" (UniqueName: \"kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg\") pod \"5e87f5ac-ee83-4ba4-b456-f378b93edd80\" (UID: \"5e87f5ac-ee83-4ba4-b456-f378b93edd80\") " Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.104174 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktzq\" (UniqueName: \"kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq\") pod \"1be0320c-fe8d-41de-bcf6-97dc4089ed39\" (UID: \"1be0320c-fe8d-41de-bcf6-97dc4089ed39\") " Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.106069 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfgd\" (UniqueName: \"kubernetes.io/projected/90caccbb-1dbf-4ba6-af11-3b21f95535dc-kube-api-access-kjfgd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.107510 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg" (OuterVolumeSpecName: "kube-api-access-9wsbg") pod "5e87f5ac-ee83-4ba4-b456-f378b93edd80" (UID: "5e87f5ac-ee83-4ba4-b456-f378b93edd80"). InnerVolumeSpecName "kube-api-access-9wsbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.109502 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq" (OuterVolumeSpecName: "kube-api-access-rktzq") pod "1be0320c-fe8d-41de-bcf6-97dc4089ed39" (UID: "1be0320c-fe8d-41de-bcf6-97dc4089ed39"). InnerVolumeSpecName "kube-api-access-rktzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.207596 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktzq\" (UniqueName: \"kubernetes.io/projected/1be0320c-fe8d-41de-bcf6-97dc4089ed39-kube-api-access-rktzq\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.207637 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsbg\" (UniqueName: \"kubernetes.io/projected/5e87f5ac-ee83-4ba4-b456-f378b93edd80-kube-api-access-9wsbg\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.408007 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5b87-account-create-zpj8s" event={"ID":"90caccbb-1dbf-4ba6-af11-3b21f95535dc","Type":"ContainerDied","Data":"fd70676b7f5f5c2180b64670c73aea9789cd7090a9107ff5eaf470e9aa3f45bd"} Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.408059 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd70676b7f5f5c2180b64670c73aea9789cd7090a9107ff5eaf470e9aa3f45bd" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.408589 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5b87-account-create-zpj8s" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.409367 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05e7-account-create-z5bwm" event={"ID":"1be0320c-fe8d-41de-bcf6-97dc4089ed39","Type":"ContainerDied","Data":"25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c"} Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.409496 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a0da78f3d6af2416e57f611c106989be6c50bff5caa54a29d4407772c15b6c" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.409533 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05e7-account-create-z5bwm" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.411191 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4628-account-create-jdb2s" event={"ID":"5e87f5ac-ee83-4ba4-b456-f378b93edd80","Type":"ContainerDied","Data":"472d5c47600431ed68c5395f389d0ebb6ce6ac6a473b51efb0b1c59a53705f8a"} Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.411228 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472d5c47600431ed68c5395f389d0ebb6ce6ac6a473b51efb0b1c59a53705f8a" Sep 30 13:58:55 crc kubenswrapper[4936]: I0930 13:58:55.411250 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4628-account-create-jdb2s" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.663523 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5xd69"] Sep 30 13:58:56 crc kubenswrapper[4936]: E0930 13:58:56.664281 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e87f5ac-ee83-4ba4-b456-f378b93edd80" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664301 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e87f5ac-ee83-4ba4-b456-f378b93edd80" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: E0930 13:58:56.664311 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be0320c-fe8d-41de-bcf6-97dc4089ed39" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664319 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be0320c-fe8d-41de-bcf6-97dc4089ed39" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: E0930 13:58:56.664365 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90caccbb-1dbf-4ba6-af11-3b21f95535dc" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664375 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="90caccbb-1dbf-4ba6-af11-3b21f95535dc" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664606 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e87f5ac-ee83-4ba4-b456-f378b93edd80" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664630 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="90caccbb-1dbf-4ba6-af11-3b21f95535dc" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.664651 4936 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1be0320c-fe8d-41de-bcf6-97dc4089ed39" containerName="mariadb-account-create" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.665433 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.671891 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.672410 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.673214 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cvpt4" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.683085 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5xd69"] Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.833129 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.833245 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8dd5\" (UniqueName: \"kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.833289 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.833536 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.935544 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8dd5\" (UniqueName: \"kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.935640 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.935790 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.935846 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.952652 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.952753 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.953191 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.956658 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8dd5\" (UniqueName: \"kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5\") pod \"nova-cell0-conductor-db-sync-5xd69\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:56 crc kubenswrapper[4936]: I0930 13:58:56.983292 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:58:57 crc kubenswrapper[4936]: I0930 13:58:57.494625 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5xd69"] Sep 30 13:58:58 crc kubenswrapper[4936]: I0930 13:58:58.451571 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5xd69" event={"ID":"22fda54e-412b-4057-b92b-6b4eb2cde369","Type":"ContainerStarted","Data":"4376634966fc79e467f37fb06de5e18e6d623efbcbd756bde73608bd9ed9acbb"} Sep 30 13:59:06 crc kubenswrapper[4936]: I0930 13:59:06.538306 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5xd69" event={"ID":"22fda54e-412b-4057-b92b-6b4eb2cde369","Type":"ContainerStarted","Data":"e2e114ab60e53fcd60fcb67717152158283014fc519e5d2fb4f16c61af7028bd"} Sep 30 13:59:06 crc kubenswrapper[4936]: I0930 13:59:06.557945 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5xd69" podStartSLOduration=2.3003054 podStartE2EDuration="10.557927884s" podCreationTimestamp="2025-09-30 13:58:56 +0000 UTC" firstStartedPulling="2025-09-30 13:58:57.498518019 +0000 UTC m=+1187.882520320" lastFinishedPulling="2025-09-30 13:59:05.756140513 +0000 UTC m=+1196.140142804" observedRunningTime="2025-09-30 13:59:06.556809534 +0000 UTC m=+1196.940811845" watchObservedRunningTime="2025-09-30 13:59:06.557927884 +0000 UTC m=+1196.941930185" Sep 30 13:59:12 crc kubenswrapper[4936]: I0930 13:59:12.579693 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Sep 30 13:59:18 crc kubenswrapper[4936]: I0930 13:59:18.632900 4936 generic.go:334] "Generic (PLEG): container finished" podID="22fda54e-412b-4057-b92b-6b4eb2cde369" 
containerID="e2e114ab60e53fcd60fcb67717152158283014fc519e5d2fb4f16c61af7028bd" exitCode=0 Sep 30 13:59:18 crc kubenswrapper[4936]: I0930 13:59:18.632999 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5xd69" event={"ID":"22fda54e-412b-4057-b92b-6b4eb2cde369","Type":"ContainerDied","Data":"e2e114ab60e53fcd60fcb67717152158283014fc519e5d2fb4f16c61af7028bd"} Sep 30 13:59:18 crc kubenswrapper[4936]: I0930 13:59:18.639980 4936 generic.go:334] "Generic (PLEG): container finished" podID="5442165b-1df8-41c0-9118-7573222c1c27" containerID="7549986bb6cb2efa19127a22ea3e64c5601cce35de4c4103f4136ddcc3771ffc" exitCode=137 Sep 30 13:59:18 crc kubenswrapper[4936]: I0930 13:59:18.640180 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerDied","Data":"7549986bb6cb2efa19127a22ea3e64c5601cce35de4c4103f4136ddcc3771ffc"} Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.489973 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648219 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648321 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbw7z\" (UniqueName: \"kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648401 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648439 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648498 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648572 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.648638 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle\") pod \"5442165b-1df8-41c0-9118-7573222c1c27\" (UID: \"5442165b-1df8-41c0-9118-7573222c1c27\") " Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.649023 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.649782 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.650095 4936 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.650111 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5442165b-1df8-41c0-9118-7573222c1c27-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.654180 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts" (OuterVolumeSpecName: "scripts") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.654791 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z" (OuterVolumeSpecName: "kube-api-access-lbw7z") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "kube-api-access-lbw7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.662165 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.662805 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5442165b-1df8-41c0-9118-7573222c1c27","Type":"ContainerDied","Data":"381b25a2c42774afd79067fed416fbd2f6274f6ce09a774287af059b1b35ddc2"} Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.662844 4936 scope.go:117] "RemoveContainer" containerID="7549986bb6cb2efa19127a22ea3e64c5601cce35de4c4103f4136ddcc3771ffc" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.687972 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.728520 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.745463 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data" (OuterVolumeSpecName: "config-data") pod "5442165b-1df8-41c0-9118-7573222c1c27" (UID: "5442165b-1df8-41c0-9118-7573222c1c27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.752204 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.752224 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.752236 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.752295 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbw7z\" (UniqueName: \"kubernetes.io/projected/5442165b-1df8-41c0-9118-7573222c1c27-kube-api-access-lbw7z\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.752307 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5442165b-1df8-41c0-9118-7573222c1c27-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.820424 4936 scope.go:117] "RemoveContainer" containerID="d7434791e9fb0a14d082f4b3f76bc437072f31cebee29d9a2c27732a2a1d761d" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.837511 4936 scope.go:117] "RemoveContainer" containerID="d02dcf361baedad07ba59068ee99b89588c40ef50c0f0482548af730002427dd" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.857041 4936 scope.go:117] "RemoveContainer" containerID="30c5926d839127feba6cadb17cc1dc9c4a968c79122640a8d9a323c58144fa83" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.909577 4936 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:59:19 crc kubenswrapper[4936]: I0930 13:59:19.995374 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.003801 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.018535 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: E0930 13:59:20.019448 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-central-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.019540 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-central-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: E0930 13:59:20.019620 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="sg-core" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.019694 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="sg-core" Sep 30 13:59:20 crc kubenswrapper[4936]: E0930 13:59:20.019771 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="proxy-httpd" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.019876 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="proxy-httpd" Sep 30 13:59:20 crc kubenswrapper[4936]: E0930 13:59:20.019972 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22fda54e-412b-4057-b92b-6b4eb2cde369" containerName="nova-cell0-conductor-db-sync" Sep 30 13:59:20 crc 
kubenswrapper[4936]: I0930 13:59:20.020054 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fda54e-412b-4057-b92b-6b4eb2cde369" containerName="nova-cell0-conductor-db-sync" Sep 30 13:59:20 crc kubenswrapper[4936]: E0930 13:59:20.020132 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-notification-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020204 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-notification-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020525 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="22fda54e-412b-4057-b92b-6b4eb2cde369" containerName="nova-cell0-conductor-db-sync" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020628 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="sg-core" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020766 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-notification-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020855 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="proxy-httpd" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.020940 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5442165b-1df8-41c0-9118-7573222c1c27" containerName="ceilometer-central-agent" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.023079 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.027771 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.027863 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.035286 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.062524 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts\") pod \"22fda54e-412b-4057-b92b-6b4eb2cde369\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.062698 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle\") pod \"22fda54e-412b-4057-b92b-6b4eb2cde369\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.062754 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data\") pod \"22fda54e-412b-4057-b92b-6b4eb2cde369\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.062856 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8dd5\" (UniqueName: \"kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5\") pod \"22fda54e-412b-4057-b92b-6b4eb2cde369\" (UID: \"22fda54e-412b-4057-b92b-6b4eb2cde369\") " Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 
13:59:20.068701 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5" (OuterVolumeSpecName: "kube-api-access-f8dd5") pod "22fda54e-412b-4057-b92b-6b4eb2cde369" (UID: "22fda54e-412b-4057-b92b-6b4eb2cde369"). InnerVolumeSpecName "kube-api-access-f8dd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.068888 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts" (OuterVolumeSpecName: "scripts") pod "22fda54e-412b-4057-b92b-6b4eb2cde369" (UID: "22fda54e-412b-4057-b92b-6b4eb2cde369"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.089214 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data" (OuterVolumeSpecName: "config-data") pod "22fda54e-412b-4057-b92b-6b4eb2cde369" (UID: "22fda54e-412b-4057-b92b-6b4eb2cde369"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.091120 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22fda54e-412b-4057-b92b-6b4eb2cde369" (UID: "22fda54e-412b-4057-b92b-6b4eb2cde369"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.164988 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165043 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkvpn\" (UniqueName: \"kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165206 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165292 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165329 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165525 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165679 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165789 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8dd5\" (UniqueName: \"kubernetes.io/projected/22fda54e-412b-4057-b92b-6b4eb2cde369-kube-api-access-f8dd5\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165804 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165815 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.165825 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22fda54e-412b-4057-b92b-6b4eb2cde369-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267162 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267242 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267280 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267300 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkvpn\" (UniqueName: \"kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267362 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267389 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267409 4936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.267854 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.268093 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.271304 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.272525 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.272734 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 
13:59:20.275921 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.286664 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkvpn\" (UniqueName: \"kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn\") pod \"ceilometer-0\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.325821 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5442165b-1df8-41c0-9118-7573222c1c27" path="/var/lib/kubelet/pods/5442165b-1df8-41c0-9118-7573222c1c27/volumes" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.343966 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.582910 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.672692 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerStarted","Data":"9571276b8c2dcf819213bca09fce0af7dc8dced19f37eb5c1f39884a3beb3f7a"} Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.678734 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5xd69" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.678812 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5xd69" event={"ID":"22fda54e-412b-4057-b92b-6b4eb2cde369","Type":"ContainerDied","Data":"4376634966fc79e467f37fb06de5e18e6d623efbcbd756bde73608bd9ed9acbb"} Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.678944 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4376634966fc79e467f37fb06de5e18e6d623efbcbd756bde73608bd9ed9acbb" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.821055 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.822777 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.829031 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-cvpt4" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.829436 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.838831 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.997490 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.997587 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:20 crc kubenswrapper[4936]: I0930 13:59:20.997695 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kkd\" (UniqueName: \"kubernetes.io/projected/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-kube-api-access-67kkd\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.099662 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kkd\" (UniqueName: \"kubernetes.io/projected/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-kube-api-access-67kkd\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.100057 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.100150 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.109580 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.110940 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.118409 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kkd\" (UniqueName: \"kubernetes.io/projected/a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae-kube-api-access-67kkd\") pod \"nova-cell0-conductor-0\" (UID: \"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae\") " pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.141231 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.565100 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 13:59:21 crc kubenswrapper[4936]: W0930 13:59:21.570092 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9dca4ae_4150_4d74_9ce1_bd3a5fbd45ae.slice/crio-3a8204d3c0a9a38cd25918e2de12f1551e072bafd3039741d55bc57064d74aa7 WatchSource:0}: Error finding container 3a8204d3c0a9a38cd25918e2de12f1551e072bafd3039741d55bc57064d74aa7: Status 404 returned error can't find the container with id 3a8204d3c0a9a38cd25918e2de12f1551e072bafd3039741d55bc57064d74aa7 Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.699130 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerStarted","Data":"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4"} Sep 30 13:59:21 crc kubenswrapper[4936]: I0930 13:59:21.701064 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae","Type":"ContainerStarted","Data":"3a8204d3c0a9a38cd25918e2de12f1551e072bafd3039741d55bc57064d74aa7"} Sep 30 13:59:22 crc kubenswrapper[4936]: I0930 13:59:22.711068 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerStarted","Data":"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643"} Sep 30 13:59:22 crc kubenswrapper[4936]: I0930 13:59:22.712933 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae","Type":"ContainerStarted","Data":"259d786492de454dc16673fb83671196d8f9ec32d67e3faa7e5cb25a369ace42"} Sep 30 13:59:22 crc 
kubenswrapper[4936]: I0930 13:59:22.713055 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:22 crc kubenswrapper[4936]: I0930 13:59:22.750245 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.750218382 podStartE2EDuration="2.750218382s" podCreationTimestamp="2025-09-30 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:22.726703919 +0000 UTC m=+1213.110706240" watchObservedRunningTime="2025-09-30 13:59:22.750218382 +0000 UTC m=+1213.134220683" Sep 30 13:59:23 crc kubenswrapper[4936]: I0930 13:59:23.722786 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerStarted","Data":"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615"} Sep 30 13:59:24 crc kubenswrapper[4936]: I0930 13:59:24.735327 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerStarted","Data":"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c"} Sep 30 13:59:24 crc kubenswrapper[4936]: I0930 13:59:24.735937 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 13:59:24 crc kubenswrapper[4936]: I0930 13:59:24.767457 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.987867993 podStartE2EDuration="5.767433881s" podCreationTimestamp="2025-09-30 13:59:19 +0000 UTC" firstStartedPulling="2025-09-30 13:59:20.587161089 +0000 UTC m=+1210.971163390" lastFinishedPulling="2025-09-30 13:59:24.366726977 +0000 UTC m=+1214.750729278" observedRunningTime="2025-09-30 13:59:24.76446533 +0000 UTC m=+1215.148467651" 
watchObservedRunningTime="2025-09-30 13:59:24.767433881 +0000 UTC m=+1215.151436182" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.173695 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.708986 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-j5vlj"] Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.710469 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.714877 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.714957 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.730442 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j5vlj"] Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.818611 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.818669 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p776r\" (UniqueName: \"kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: 
I0930 13:59:26.818902 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.818947 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.896133 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.936819 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.942659 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.942712 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p776r\" (UniqueName: \"kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.942799 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.942823 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.960183 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" 
Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.964830 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.980355 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:26 crc kubenswrapper[4936]: I0930 13:59:26.984695 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.017964 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p776r\" (UniqueName: \"kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r\") pod \"nova-cell0-cell-mapping-j5vlj\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.040564 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.045234 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.045359 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9dl\" (UniqueName: \"kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.045421 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.045513 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.063838 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.115398 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.116932 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.132746 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.148132 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.148217 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.148260 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9dl\" (UniqueName: \"kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.148291 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.153454 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: 
I0930 13:59:27.156285 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.164859 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.209790 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.219612 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9dl\" (UniqueName: \"kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl\") pod \"nova-api-0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.239476 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.240902 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.252473 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.253904 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.253938 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glgb4\" (UniqueName: \"kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.254020 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.254075 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.254679 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.260525 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.315354 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.318684 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.325028 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355455 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355489 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfj92\" (UniqueName: \"kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355513 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355577 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355603 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355640 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.355657 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glgb4\" (UniqueName: \"kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.365063 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.367430 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.373921 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.376914 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.379441 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.381140 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.387619 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.454128 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glgb4\" (UniqueName: \"kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4\") pod \"nova-metadata-0\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457615 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457672 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457731 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwn2\" (UniqueName: \"kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457762 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457808 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457835 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457861 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457882 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457907 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.457962 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfztk\" (UniqueName: \"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.461532 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfj92\" (UniqueName: \"kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.468934 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.470981 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.490350 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfj92\" (UniqueName: \"kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92\") pod \"nova-scheduler-0\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562697 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562771 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562813 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: 
\"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562837 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562863 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562929 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfztk\" (UniqueName: \"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.562966 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.563009 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwn2\" (UniqueName: \"kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " 
pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.564438 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.564459 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.569535 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.571094 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.576790 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.576924 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.584686 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfztk\" (UniqueName: \"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk\") pod \"nova-cell1-novncproxy-0\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.586229 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwn2\" (UniqueName: \"kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2\") pod \"dnsmasq-dns-566b5b7845-z47ww\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.587832 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.613122 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.697768 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.717812 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:27 crc kubenswrapper[4936]: I0930 13:59:27.872881 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-j5vlj"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.013590 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:28 crc kubenswrapper[4936]: W0930 13:59:28.019255 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod784bde60_fe66_4bf1_8cc3_0d66acb994c0.slice/crio-b6b7d920228ef51657c26cd39f0500d2add8b7b4b4bef0ca843fab08bf98b3ad WatchSource:0}: Error finding container b6b7d920228ef51657c26cd39f0500d2add8b7b4b4bef0ca843fab08bf98b3ad: Status 404 returned error can't find the container with id b6b7d920228ef51657c26cd39f0500d2add8b7b4b4bef0ca843fab08bf98b3ad Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.181025 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.438037 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.543169 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 13:59:28 crc kubenswrapper[4936]: W0930 13:59:28.560931 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce0b046e_7c77_461b_ad62_1ece23a1225c.slice/crio-329fba4d9645d125ddb64e04ffdc949e645be2a6cf374cdab0084cea690edf56 WatchSource:0}: Error finding container 329fba4d9645d125ddb64e04ffdc949e645be2a6cf374cdab0084cea690edf56: Status 404 returned error can't find the container with id 329fba4d9645d125ddb64e04ffdc949e645be2a6cf374cdab0084cea690edf56 Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.607970 4936 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.621219 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n9w76"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.632571 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n9w76"] Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.632683 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.641198 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.641262 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.701083 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.701126 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdzc\" (UniqueName: \"kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.701220 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.701294 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.803620 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerStarted","Data":"b6b7d920228ef51657c26cd39f0500d2add8b7b4b4bef0ca843fab08bf98b3ad"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.807241 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.807410 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hdzc\" (UniqueName: \"kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.807711 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts\") 
pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.807769 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.811420 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerStarted","Data":"f6e34487814a93512024793d0996efd1a6751b52bc86315f8f26fab4b2fe1fd3"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.815243 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.816047 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.816050 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " 
pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.819425 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40308063-f8c5-4aae-99be-736964f4ec37","Type":"ContainerStarted","Data":"65be52be26cbbac05626bf1f0f98179209525b31191abd43a6f45f84dae21587"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.830642 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9","Type":"ContainerStarted","Data":"0e56b8915e3b3b0cf2cedfd4a79623bbdc6dd3628bf4a3460df5092fffa4e145"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.830881 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdzc\" (UniqueName: \"kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc\") pod \"nova-cell1-conductor-db-sync-n9w76\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.833056 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j5vlj" event={"ID":"ca906ecd-ab9e-433a-982b-79ca504b6085","Type":"ContainerStarted","Data":"a345b11302e09c7538e7e71958d7fe5df5b465ae48da0abdada4eb967b37e2e7"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.833090 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j5vlj" event={"ID":"ca906ecd-ab9e-433a-982b-79ca504b6085","Type":"ContainerStarted","Data":"456fac78fa7d79851288d5df9f29fa9942d20d31c19b22390a9f40d6606a651c"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.836165 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" 
event={"ID":"ce0b046e-7c77-461b-ad62-1ece23a1225c","Type":"ContainerStarted","Data":"329fba4d9645d125ddb64e04ffdc949e645be2a6cf374cdab0084cea690edf56"} Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.858392 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-j5vlj" podStartSLOduration=2.8583719739999998 podStartE2EDuration="2.858371974s" podCreationTimestamp="2025-09-30 13:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:28.85127313 +0000 UTC m=+1219.235275431" watchObservedRunningTime="2025-09-30 13:59:28.858371974 +0000 UTC m=+1219.242374275" Sep 30 13:59:28 crc kubenswrapper[4936]: I0930 13:59:28.957530 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:29 crc kubenswrapper[4936]: I0930 13:59:29.628239 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n9w76"] Sep 30 13:59:29 crc kubenswrapper[4936]: W0930 13:59:29.702078 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe466675_627d_41b4_910f_921c06848e48.slice/crio-fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93 WatchSource:0}: Error finding container fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93: Status 404 returned error can't find the container with id fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93 Sep 30 13:59:29 crc kubenswrapper[4936]: I0930 13:59:29.858427 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n9w76" event={"ID":"fe466675-627d-41b4-910f-921c06848e48","Type":"ContainerStarted","Data":"fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93"} Sep 30 13:59:29 crc kubenswrapper[4936]: I0930 
13:59:29.868705 4936 generic.go:334] "Generic (PLEG): container finished" podID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerID="0037111496d5c4e942d4dd609ec00d06a8d1b35d7263467ec401fc53cb6f385f" exitCode=0 Sep 30 13:59:29 crc kubenswrapper[4936]: I0930 13:59:29.871207 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" event={"ID":"ce0b046e-7c77-461b-ad62-1ece23a1225c","Type":"ContainerDied","Data":"0037111496d5c4e942d4dd609ec00d06a8d1b35d7263467ec401fc53cb6f385f"} Sep 30 13:59:30 crc kubenswrapper[4936]: I0930 13:59:30.906672 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n9w76" event={"ID":"fe466675-627d-41b4-910f-921c06848e48","Type":"ContainerStarted","Data":"f8a5d50b6dc48579a0b04fefd0890b3e70f3d4e0f3837f4127a0b4d9c9d096c7"} Sep 30 13:59:30 crc kubenswrapper[4936]: I0930 13:59:30.919937 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" event={"ID":"ce0b046e-7c77-461b-ad62-1ece23a1225c","Type":"ContainerStarted","Data":"1c6de2f662fa7dee0c7e3dcd4c79caa5be921e0d31d4c1f921fad4a09d24940b"} Sep 30 13:59:30 crc kubenswrapper[4936]: I0930 13:59:30.920936 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:30 crc kubenswrapper[4936]: I0930 13:59:30.938963 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-n9w76" podStartSLOduration=2.938947165 podStartE2EDuration="2.938947165s" podCreationTimestamp="2025-09-30 13:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:30.937442624 +0000 UTC m=+1221.321444925" watchObservedRunningTime="2025-09-30 13:59:30.938947165 +0000 UTC m=+1221.322949466" Sep 30 13:59:30 crc kubenswrapper[4936]: I0930 13:59:30.961032 4936 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:31 crc kubenswrapper[4936]: I0930 13:59:30.999880 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" podStartSLOduration=3.999861408 podStartE2EDuration="3.999861408s" podCreationTimestamp="2025-09-30 13:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:30.971833193 +0000 UTC m=+1221.355835504" watchObservedRunningTime="2025-09-30 13:59:30.999861408 +0000 UTC m=+1221.383863709" Sep 30 13:59:31 crc kubenswrapper[4936]: I0930 13:59:31.007223 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 13:59:33 crc kubenswrapper[4936]: I0930 13:59:33.968073 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerStarted","Data":"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749"} Sep 30 13:59:33 crc kubenswrapper[4936]: I0930 13:59:33.986976 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0" gracePeriod=30 Sep 30 13:59:34 crc kubenswrapper[4936]: I0930 13:59:34.017477 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.132397459 podStartE2EDuration="7.017458554s" podCreationTimestamp="2025-09-30 13:59:27 +0000 UTC" firstStartedPulling="2025-09-30 13:59:28.607397378 +0000 UTC m=+1218.991399679" lastFinishedPulling="2025-09-30 13:59:33.492458473 +0000 UTC m=+1223.876460774" observedRunningTime="2025-09-30 13:59:34.009661761 +0000 UTC 
m=+1224.393664052" watchObservedRunningTime="2025-09-30 13:59:34.017458554 +0000 UTC m=+1224.401460855" Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.006190 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerStarted","Data":"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1"} Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.006533 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerStarted","Data":"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42"} Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.008750 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerStarted","Data":"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f"} Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.008887 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-log" containerID="cri-o://09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" gracePeriod=30 Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.009030 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-metadata" containerID="cri-o://aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" gracePeriod=30 Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.017632 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"40308063-f8c5-4aae-99be-736964f4ec37","Type":"ContainerStarted","Data":"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f"} Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.021742 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9","Type":"ContainerStarted","Data":"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0"} Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.039778 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.6018878340000002 podStartE2EDuration="9.039755848s" podCreationTimestamp="2025-09-30 13:59:26 +0000 UTC" firstStartedPulling="2025-09-30 13:59:28.047262049 +0000 UTC m=+1218.431264350" lastFinishedPulling="2025-09-30 13:59:33.485130063 +0000 UTC m=+1223.869132364" observedRunningTime="2025-09-30 13:59:35.033541228 +0000 UTC m=+1225.417543539" watchObservedRunningTime="2025-09-30 13:59:35.039755848 +0000 UTC m=+1225.423758149" Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.062924 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.773026339 podStartE2EDuration="9.062905221s" podCreationTimestamp="2025-09-30 13:59:26 +0000 UTC" firstStartedPulling="2025-09-30 13:59:28.19668137 +0000 UTC m=+1218.580683671" lastFinishedPulling="2025-09-30 13:59:33.486560252 +0000 UTC m=+1223.870562553" observedRunningTime="2025-09-30 13:59:35.060087384 +0000 UTC m=+1225.444089705" watchObservedRunningTime="2025-09-30 13:59:35.062905221 +0000 UTC m=+1225.446907512" Sep 30 13:59:35 crc kubenswrapper[4936]: I0930 13:59:35.095163 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.049639084 podStartE2EDuration="8.095140811s" podCreationTimestamp="2025-09-30 13:59:27 +0000 UTC" 
firstStartedPulling="2025-09-30 13:59:28.439653686 +0000 UTC m=+1218.823655987" lastFinishedPulling="2025-09-30 13:59:33.485155413 +0000 UTC m=+1223.869157714" observedRunningTime="2025-09-30 13:59:35.085897029 +0000 UTC m=+1225.469899330" watchObservedRunningTime="2025-09-30 13:59:35.095140811 +0000 UTC m=+1225.479143112" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.007755 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033138 4936 generic.go:334] "Generic (PLEG): container finished" podID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerID="aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" exitCode=0 Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033185 4936 generic.go:334] "Generic (PLEG): container finished" podID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerID="09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" exitCode=143 Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033505 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerDied","Data":"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f"} Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033557 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerDied","Data":"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749"} Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033569 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989a5501-f4e5-43b6-b12d-658a5f5db28b","Type":"ContainerDied","Data":"f6e34487814a93512024793d0996efd1a6751b52bc86315f8f26fab4b2fe1fd3"} Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033584 4936 scope.go:117] 
"RemoveContainer" containerID="aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.033745 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.062848 4936 scope.go:117] "RemoveContainer" containerID="09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.071886 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs\") pod \"989a5501-f4e5-43b6-b12d-658a5f5db28b\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.071955 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle\") pod \"989a5501-f4e5-43b6-b12d-658a5f5db28b\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.072030 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glgb4\" (UniqueName: \"kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4\") pod \"989a5501-f4e5-43b6-b12d-658a5f5db28b\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.072133 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data\") pod \"989a5501-f4e5-43b6-b12d-658a5f5db28b\" (UID: \"989a5501-f4e5-43b6-b12d-658a5f5db28b\") " Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.079259 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs" (OuterVolumeSpecName: "logs") pod "989a5501-f4e5-43b6-b12d-658a5f5db28b" (UID: "989a5501-f4e5-43b6-b12d-658a5f5db28b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.100137 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4" (OuterVolumeSpecName: "kube-api-access-glgb4") pod "989a5501-f4e5-43b6-b12d-658a5f5db28b" (UID: "989a5501-f4e5-43b6-b12d-658a5f5db28b"). InnerVolumeSpecName "kube-api-access-glgb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.131562 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "989a5501-f4e5-43b6-b12d-658a5f5db28b" (UID: "989a5501-f4e5-43b6-b12d-658a5f5db28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.139563 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data" (OuterVolumeSpecName: "config-data") pod "989a5501-f4e5-43b6-b12d-658a5f5db28b" (UID: "989a5501-f4e5-43b6-b12d-658a5f5db28b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.158133 4936 scope.go:117] "RemoveContainer" containerID="aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" Sep 30 13:59:36 crc kubenswrapper[4936]: E0930 13:59:36.161030 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f\": container with ID starting with aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f not found: ID does not exist" containerID="aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.161174 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f"} err="failed to get container status \"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f\": rpc error: code = NotFound desc = could not find container \"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f\": container with ID starting with aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f not found: ID does not exist" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.161288 4936 scope.go:117] "RemoveContainer" containerID="09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" Sep 30 13:59:36 crc kubenswrapper[4936]: E0930 13:59:36.164968 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749\": container with ID starting with 09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749 not found: ID does not exist" containerID="09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.165125 
4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749"} err="failed to get container status \"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749\": rpc error: code = NotFound desc = could not find container \"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749\": container with ID starting with 09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749 not found: ID does not exist" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.165211 4936 scope.go:117] "RemoveContainer" containerID="aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.166139 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f"} err="failed to get container status \"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f\": rpc error: code = NotFound desc = could not find container \"aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f\": container with ID starting with aa37472a3f7e8504adfc90462667d55980f0ec5b87cb659753d81d96aa9b922f not found: ID does not exist" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.166245 4936 scope.go:117] "RemoveContainer" containerID="09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.166963 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749"} err="failed to get container status \"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749\": rpc error: code = NotFound desc = could not find container \"09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749\": container with ID starting with 
09041aa8e2dd52a6e5ad348d83fe3291e80f421e985345851421642ff6223749 not found: ID does not exist" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.173948 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.173978 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989a5501-f4e5-43b6-b12d-658a5f5db28b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.173988 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989a5501-f4e5-43b6-b12d-658a5f5db28b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.173997 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glgb4\" (UniqueName: \"kubernetes.io/projected/989a5501-f4e5-43b6-b12d-658a5f5db28b-kube-api-access-glgb4\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.373939 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.383830 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.405003 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:36 crc kubenswrapper[4936]: E0930 13:59:36.405469 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-metadata" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.405495 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" 
containerName="nova-metadata-metadata" Sep 30 13:59:36 crc kubenswrapper[4936]: E0930 13:59:36.405522 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-log" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.405532 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-log" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.405761 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-log" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.405799 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" containerName="nova-metadata-metadata" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.407033 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.415878 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.416150 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.426652 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.477588 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.477907 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzxn\" (UniqueName: \"kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.478175 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.478297 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.478346 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.579849 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzxn\" (UniqueName: \"kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.579941 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.579990 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.580016 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.580090 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.580553 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.583793 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 
13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.584776 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.586111 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.603991 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzxn\" (UniqueName: \"kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn\") pod \"nova-metadata-0\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " pod="openstack/nova-metadata-0" Sep 30 13:59:36 crc kubenswrapper[4936]: I0930 13:59:36.732436 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.232662 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:37 crc kubenswrapper[4936]: W0930 13:59:37.234556 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb540109c_7b24_438a_89e0_347fcee96d82.slice/crio-5dc9cc075b9288d0237dc5b38dbb26abcfaebd2f055d8a8a9311c7e8133c532c WatchSource:0}: Error finding container 5dc9cc075b9288d0237dc5b38dbb26abcfaebd2f055d8a8a9311c7e8133c532c: Status 404 returned error can't find the container with id 5dc9cc075b9288d0237dc5b38dbb26abcfaebd2f055d8a8a9311c7e8133c532c Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.255684 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.256013 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.614038 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.614084 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.648291 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.699598 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.718604 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 
13:59:37.771065 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:59:37 crc kubenswrapper[4936]: I0930 13:59:37.771605 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="dnsmasq-dns" containerID="cri-o://26f9aa92815ecdda437d00df8e078ae0c2d085547d74710fbd35c9020cc63455" gracePeriod=10 Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.072261 4936 generic.go:334] "Generic (PLEG): container finished" podID="fe31c877-0e46-4a03-b18d-773f9487573d" containerID="26f9aa92815ecdda437d00df8e078ae0c2d085547d74710fbd35c9020cc63455" exitCode=0 Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.072314 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" event={"ID":"fe31c877-0e46-4a03-b18d-773f9487573d","Type":"ContainerDied","Data":"26f9aa92815ecdda437d00df8e078ae0c2d085547d74710fbd35c9020cc63455"} Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.077585 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerStarted","Data":"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22"} Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.077630 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerStarted","Data":"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4"} Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.077643 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerStarted","Data":"5dc9cc075b9288d0237dc5b38dbb26abcfaebd2f055d8a8a9311c7e8133c532c"} Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 
13:59:38.112835 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.112812738 podStartE2EDuration="2.112812738s" podCreationTimestamp="2025-09-30 13:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:38.102795514 +0000 UTC m=+1228.486797815" watchObservedRunningTime="2025-09-30 13:59:38.112812738 +0000 UTC m=+1228.496815039" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.134981 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.341647 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989a5501-f4e5-43b6-b12d-658a5f5db28b" path="/var/lib/kubelet/pods/989a5501-f4e5-43b6-b12d-658a5f5db28b/volumes" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.355177 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.355211 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.490221 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.516567 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb\") pod \"fe31c877-0e46-4a03-b18d-773f9487573d\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.516645 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb\") pod \"fe31c877-0e46-4a03-b18d-773f9487573d\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.516685 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config\") pod \"fe31c877-0e46-4a03-b18d-773f9487573d\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.516721 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzx68\" (UniqueName: \"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68\") pod \"fe31c877-0e46-4a03-b18d-773f9487573d\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.516800 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc\") pod \"fe31c877-0e46-4a03-b18d-773f9487573d\" (UID: \"fe31c877-0e46-4a03-b18d-773f9487573d\") " Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.579590 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68" (OuterVolumeSpecName: "kube-api-access-kzx68") pod "fe31c877-0e46-4a03-b18d-773f9487573d" (UID: "fe31c877-0e46-4a03-b18d-773f9487573d"). InnerVolumeSpecName "kube-api-access-kzx68". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.620789 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzx68\" (UniqueName: \"kubernetes.io/projected/fe31c877-0e46-4a03-b18d-773f9487573d-kube-api-access-kzx68\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.697251 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config" (OuterVolumeSpecName: "config") pod "fe31c877-0e46-4a03-b18d-773f9487573d" (UID: "fe31c877-0e46-4a03-b18d-773f9487573d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.708254 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe31c877-0e46-4a03-b18d-773f9487573d" (UID: "fe31c877-0e46-4a03-b18d-773f9487573d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.708928 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe31c877-0e46-4a03-b18d-773f9487573d" (UID: "fe31c877-0e46-4a03-b18d-773f9487573d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.723084 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.723129 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.723146 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.728139 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe31c877-0e46-4a03-b18d-773f9487573d" (UID: "fe31c877-0e46-4a03-b18d-773f9487573d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:59:38 crc kubenswrapper[4936]: I0930 13:59:38.824567 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe31c877-0e46-4a03-b18d-773f9487573d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.087952 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" event={"ID":"fe31c877-0e46-4a03-b18d-773f9487573d","Type":"ContainerDied","Data":"ecba3779c60bbcd65e4904c2b8e477ecdb53621add086f676870b27396068744"} Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.088026 4936 scope.go:117] "RemoveContainer" containerID="26f9aa92815ecdda437d00df8e078ae0c2d085547d74710fbd35c9020cc63455" Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.088068 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-f26mj" Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.109683 4936 scope.go:117] "RemoveContainer" containerID="ccccd5c293ea5698f4da4d7a6a763675657c1839a58d32a483144ddf1e731254" Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.127546 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:59:39 crc kubenswrapper[4936]: I0930 13:59:39.137112 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-f26mj"] Sep 30 13:59:40 crc kubenswrapper[4936]: I0930 13:59:40.332837 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" path="/var/lib/kubelet/pods/fe31c877-0e46-4a03-b18d-773f9487573d/volumes" Sep 30 13:59:41 crc kubenswrapper[4936]: I0930 13:59:41.733278 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:59:41 crc kubenswrapper[4936]: I0930 13:59:41.733598 4936 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:59:42 crc kubenswrapper[4936]: I0930 13:59:42.117655 4936 generic.go:334] "Generic (PLEG): container finished" podID="ca906ecd-ab9e-433a-982b-79ca504b6085" containerID="a345b11302e09c7538e7e71958d7fe5df5b465ae48da0abdada4eb967b37e2e7" exitCode=0 Sep 30 13:59:42 crc kubenswrapper[4936]: I0930 13:59:42.117700 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j5vlj" event={"ID":"ca906ecd-ab9e-433a-982b-79ca504b6085","Type":"ContainerDied","Data":"a345b11302e09c7538e7e71958d7fe5df5b465ae48da0abdada4eb967b37e2e7"} Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.128028 4936 generic.go:334] "Generic (PLEG): container finished" podID="fe466675-627d-41b4-910f-921c06848e48" containerID="f8a5d50b6dc48579a0b04fefd0890b3e70f3d4e0f3837f4127a0b4d9c9d096c7" exitCode=0 Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.128118 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n9w76" event={"ID":"fe466675-627d-41b4-910f-921c06848e48","Type":"ContainerDied","Data":"f8a5d50b6dc48579a0b04fefd0890b3e70f3d4e0f3837f4127a0b4d9c9d096c7"} Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.458807 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.518670 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts\") pod \"ca906ecd-ab9e-433a-982b-79ca504b6085\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.518824 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle\") pod \"ca906ecd-ab9e-433a-982b-79ca504b6085\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.518858 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data\") pod \"ca906ecd-ab9e-433a-982b-79ca504b6085\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.518978 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p776r\" (UniqueName: \"kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r\") pod \"ca906ecd-ab9e-433a-982b-79ca504b6085\" (UID: \"ca906ecd-ab9e-433a-982b-79ca504b6085\") " Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.525164 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts" (OuterVolumeSpecName: "scripts") pod "ca906ecd-ab9e-433a-982b-79ca504b6085" (UID: "ca906ecd-ab9e-433a-982b-79ca504b6085"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.534635 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r" (OuterVolumeSpecName: "kube-api-access-p776r") pod "ca906ecd-ab9e-433a-982b-79ca504b6085" (UID: "ca906ecd-ab9e-433a-982b-79ca504b6085"). InnerVolumeSpecName "kube-api-access-p776r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.545720 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca906ecd-ab9e-433a-982b-79ca504b6085" (UID: "ca906ecd-ab9e-433a-982b-79ca504b6085"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.548258 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data" (OuterVolumeSpecName: "config-data") pod "ca906ecd-ab9e-433a-982b-79ca504b6085" (UID: "ca906ecd-ab9e-433a-982b-79ca504b6085"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.621068 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.621100 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.621110 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p776r\" (UniqueName: \"kubernetes.io/projected/ca906ecd-ab9e-433a-982b-79ca504b6085-kube-api-access-p776r\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:43 crc kubenswrapper[4936]: I0930 13:59:43.621119 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca906ecd-ab9e-433a-982b-79ca504b6085-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.164644 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-j5vlj" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.165696 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-j5vlj" event={"ID":"ca906ecd-ab9e-433a-982b-79ca504b6085","Type":"ContainerDied","Data":"456fac78fa7d79851288d5df9f29fa9942d20d31c19b22390a9f40d6606a651c"} Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.165783 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="456fac78fa7d79851288d5df9f29fa9942d20d31c19b22390a9f40d6606a651c" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.310520 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.310953 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-api" containerID="cri-o://00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42" gracePeriod=30 Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.310831 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-log" containerID="cri-o://ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1" gracePeriod=30 Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.359574 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.359764 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="40308063-f8c5-4aae-99be-736964f4ec37" containerName="nova-scheduler-scheduler" containerID="cri-o://92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" gracePeriod=30 Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 
13:59:44.386041 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.386297 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-log" containerID="cri-o://c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" gracePeriod=30 Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.386840 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-metadata" containerID="cri-o://45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" gracePeriod=30 Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.612653 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.641371 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data\") pod \"fe466675-627d-41b4-910f-921c06848e48\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.641477 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle\") pod \"fe466675-627d-41b4-910f-921c06848e48\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.641601 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hdzc\" (UniqueName: \"kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc\") pod 
\"fe466675-627d-41b4-910f-921c06848e48\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.641666 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts\") pod \"fe466675-627d-41b4-910f-921c06848e48\" (UID: \"fe466675-627d-41b4-910f-921c06848e48\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.648724 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc" (OuterVolumeSpecName: "kube-api-access-6hdzc") pod "fe466675-627d-41b4-910f-921c06848e48" (UID: "fe466675-627d-41b4-910f-921c06848e48"). InnerVolumeSpecName "kube-api-access-6hdzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.683879 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data" (OuterVolumeSpecName: "config-data") pod "fe466675-627d-41b4-910f-921c06848e48" (UID: "fe466675-627d-41b4-910f-921c06848e48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.696035 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts" (OuterVolumeSpecName: "scripts") pod "fe466675-627d-41b4-910f-921c06848e48" (UID: "fe466675-627d-41b4-910f-921c06848e48"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.701574 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe466675-627d-41b4-910f-921c06848e48" (UID: "fe466675-627d-41b4-910f-921c06848e48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.744756 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.744801 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hdzc\" (UniqueName: \"kubernetes.io/projected/fe466675-627d-41b4-910f-921c06848e48-kube-api-access-6hdzc\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.744818 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.744831 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe466675-627d-41b4-910f-921c06848e48-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.938999 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.947548 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs\") pod \"b540109c-7b24-438a-89e0-347fcee96d82\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.947885 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs" (OuterVolumeSpecName: "logs") pod "b540109c-7b24-438a-89e0-347fcee96d82" (UID: "b540109c-7b24-438a-89e0-347fcee96d82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.947993 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs\") pod \"b540109c-7b24-438a-89e0-347fcee96d82\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.948130 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data\") pod \"b540109c-7b24-438a-89e0-347fcee96d82\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.948282 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hzxn\" (UniqueName: \"kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn\") pod \"b540109c-7b24-438a-89e0-347fcee96d82\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.948474 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle\") pod \"b540109c-7b24-438a-89e0-347fcee96d82\" (UID: \"b540109c-7b24-438a-89e0-347fcee96d82\") " Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.948958 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b540109c-7b24-438a-89e0-347fcee96d82-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.956688 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn" (OuterVolumeSpecName: "kube-api-access-8hzxn") pod "b540109c-7b24-438a-89e0-347fcee96d82" (UID: "b540109c-7b24-438a-89e0-347fcee96d82"). InnerVolumeSpecName "kube-api-access-8hzxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:44 crc kubenswrapper[4936]: I0930 13:59:44.980478 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data" (OuterVolumeSpecName: "config-data") pod "b540109c-7b24-438a-89e0-347fcee96d82" (UID: "b540109c-7b24-438a-89e0-347fcee96d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.016407 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b540109c-7b24-438a-89e0-347fcee96d82" (UID: "b540109c-7b24-438a-89e0-347fcee96d82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.027904 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b540109c-7b24-438a-89e0-347fcee96d82" (UID: "b540109c-7b24-438a-89e0-347fcee96d82"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.050774 4936 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.050810 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.050820 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hzxn\" (UniqueName: \"kubernetes.io/projected/b540109c-7b24-438a-89e0-347fcee96d82-kube-api-access-8hzxn\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.050832 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b540109c-7b24-438a-89e0-347fcee96d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.180244 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n9w76" event={"ID":"fe466675-627d-41b4-910f-921c06848e48","Type":"ContainerDied","Data":"fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93"} Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.180304 4936 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa017b5ebec8ff362dfef01be333fa42a013e4e0f2ad97320f31c739cbca9a93" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.180259 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n9w76" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.184360 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerDied","Data":"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1"} Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.184319 4936 generic.go:334] "Generic (PLEG): container finished" podID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerID="ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1" exitCode=143 Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.186947 4936 generic.go:334] "Generic (PLEG): container finished" podID="b540109c-7b24-438a-89e0-347fcee96d82" containerID="45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" exitCode=0 Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187046 4936 generic.go:334] "Generic (PLEG): container finished" podID="b540109c-7b24-438a-89e0-347fcee96d82" containerID="c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" exitCode=143 Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187045 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187013 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerDied","Data":"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22"} Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187423 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerDied","Data":"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4"} Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187442 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b540109c-7b24-438a-89e0-347fcee96d82","Type":"ContainerDied","Data":"5dc9cc075b9288d0237dc5b38dbb26abcfaebd2f055d8a8a9311c7e8133c532c"} Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.187476 4936 scope.go:117] "RemoveContainer" containerID="45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.230703 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.231726 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe466675-627d-41b4-910f-921c06848e48" containerName="nova-cell1-conductor-db-sync" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.231803 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe466675-627d-41b4-910f-921c06848e48" containerName="nova-cell1-conductor-db-sync" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.231910 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca906ecd-ab9e-433a-982b-79ca504b6085" containerName="nova-manage" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.231979 4936 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ca906ecd-ab9e-433a-982b-79ca504b6085" containerName="nova-manage" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.232037 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="init" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232087 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="init" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.232147 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="dnsmasq-dns" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232196 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="dnsmasq-dns" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.232252 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-metadata" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232308 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-metadata" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.232628 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-log" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232697 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-log" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232918 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-metadata" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.232982 4936 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fe31c877-0e46-4a03-b18d-773f9487573d" containerName="dnsmasq-dns" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.233040 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b540109c-7b24-438a-89e0-347fcee96d82" containerName="nova-metadata-log" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.233094 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe466675-627d-41b4-910f-921c06848e48" containerName="nova-cell1-conductor-db-sync" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.233164 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca906ecd-ab9e-433a-982b-79ca504b6085" containerName="nova-manage" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.233949 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.239525 4936 scope.go:117] "RemoveContainer" containerID="c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.239851 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.256934 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.257009 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " 
pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.257049 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbpvf\" (UniqueName: \"kubernetes.io/projected/d9ca401b-9423-44d6-a87a-b1d5cc37b381-kube-api-access-nbpvf\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.283098 4936 scope.go:117] "RemoveContainer" containerID="45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.284404 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22\": container with ID starting with 45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22 not found: ID does not exist" containerID="45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.293405 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22"} err="failed to get container status \"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22\": rpc error: code = NotFound desc = could not find container \"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22\": container with ID starting with 45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22 not found: ID does not exist" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.293756 4936 scope.go:117] "RemoveContainer" containerID="c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" Sep 30 13:59:45 crc kubenswrapper[4936]: E0930 13:59:45.298834 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4\": container with ID starting with c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4 not found: ID does not exist" containerID="c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.298986 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4"} err="failed to get container status \"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4\": rpc error: code = NotFound desc = could not find container \"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4\": container with ID starting with c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4 not found: ID does not exist" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.299063 4936 scope.go:117] "RemoveContainer" containerID="45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.300911 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22"} err="failed to get container status \"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22\": rpc error: code = NotFound desc = could not find container \"45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22\": container with ID starting with 45268806b123a0e7a81b1031baa1abae17a225e80bde5308bd30a7782e69ca22 not found: ID does not exist" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.300963 4936 scope.go:117] "RemoveContainer" containerID="c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.301491 4936 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4"} err="failed to get container status \"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4\": rpc error: code = NotFound desc = could not find container \"c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4\": container with ID starting with c753bc0d35ef2521e7fb75e83e9f5ca1a61c5f401de5705bcf9fdf9a73e5b8c4 not found: ID does not exist" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.304536 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.321942 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.337275 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.340052 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.342616 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.343715 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.355009 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.359614 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.359823 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mfv\" (UniqueName: \"kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.359897 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.359996 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.360126 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.360242 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbpvf\" (UniqueName: \"kubernetes.io/projected/d9ca401b-9423-44d6-a87a-b1d5cc37b381-kube-api-access-nbpvf\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.360454 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.360539 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.369074 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 
13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.370999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ca401b-9423-44d6-a87a-b1d5cc37b381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.374246 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.382130 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbpvf\" (UniqueName: \"kubernetes.io/projected/d9ca401b-9423-44d6-a87a-b1d5cc37b381-kube-api-access-nbpvf\") pod \"nova-cell1-conductor-0\" (UID: \"d9ca401b-9423-44d6-a87a-b1d5cc37b381\") " pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.462110 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.462188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.462237 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc 
kubenswrapper[4936]: I0930 13:59:45.462290 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45mfv\" (UniqueName: \"kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.462378 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.465668 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.465838 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.466059 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.470811 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.488976 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mfv\" (UniqueName: \"kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv\") pod \"nova-metadata-0\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") " pod="openstack/nova-metadata-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.580171 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:45 crc kubenswrapper[4936]: I0930 13:59:45.665600 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 13:59:46 crc kubenswrapper[4936]: I0930 13:59:46.082254 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 13:59:46 crc kubenswrapper[4936]: W0930 13:59:46.082473 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9ca401b_9423_44d6_a87a_b1d5cc37b381.slice/crio-31c5aa0945d06a705af477a919b318a856537bb030e14dffc7f2b4e1fa90d101 WatchSource:0}: Error finding container 31c5aa0945d06a705af477a919b318a856537bb030e14dffc7f2b4e1fa90d101: Status 404 returned error can't find the container with id 31c5aa0945d06a705af477a919b318a856537bb030e14dffc7f2b4e1fa90d101 Sep 30 13:59:46 crc kubenswrapper[4936]: W0930 13:59:46.198995 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438cb977_993f_4bd0_b3d4_a1b81ecbadff.slice/crio-533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef WatchSource:0}: Error finding container 
533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef: Status 404 returned error can't find the container with id 533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef Sep 30 13:59:46 crc kubenswrapper[4936]: I0930 13:59:46.199241 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d9ca401b-9423-44d6-a87a-b1d5cc37b381","Type":"ContainerStarted","Data":"31c5aa0945d06a705af477a919b318a856537bb030e14dffc7f2b4e1fa90d101"} Sep 30 13:59:46 crc kubenswrapper[4936]: I0930 13:59:46.204618 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 13:59:46 crc kubenswrapper[4936]: I0930 13:59:46.327808 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b540109c-7b24-438a-89e0-347fcee96d82" path="/var/lib/kubelet/pods/b540109c-7b24-438a-89e0-347fcee96d82/volumes" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.208525 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerStarted","Data":"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18"} Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.208798 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerStarted","Data":"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67"} Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.208812 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerStarted","Data":"533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef"} Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.210170 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"d9ca401b-9423-44d6-a87a-b1d5cc37b381","Type":"ContainerStarted","Data":"5e2bbe135ea69d3c359496de01b790862fa492dd49e1bafafd5d2ada2d25c55f"} Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.210324 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.233893 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.233871046 podStartE2EDuration="2.233871046s" podCreationTimestamp="2025-09-30 13:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:47.227195814 +0000 UTC m=+1237.611198115" watchObservedRunningTime="2025-09-30 13:59:47.233871046 +0000 UTC m=+1237.617873347" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.255848 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.255827366 podStartE2EDuration="2.255827366s" podCreationTimestamp="2025-09-30 13:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:47.250615984 +0000 UTC m=+1237.634618285" watchObservedRunningTime="2025-09-30 13:59:47.255827366 +0000 UTC m=+1237.639829667" Sep 30 13:59:47 crc kubenswrapper[4936]: E0930 13:59:47.615568 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:59:47 crc kubenswrapper[4936]: E0930 13:59:47.616620 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:59:47 crc kubenswrapper[4936]: E0930 13:59:47.617981 4936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 13:59:47 crc kubenswrapper[4936]: E0930 13:59:47.618017 4936 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="40308063-f8c5-4aae-99be-736964f4ec37" containerName="nova-scheduler-scheduler" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.867221 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.906083 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle\") pod \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.906134 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data\") pod \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.906184 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9dl\" (UniqueName: \"kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl\") pod \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.906238 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs\") pod \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\" (UID: \"784bde60-fe66-4bf1-8cc3-0d66acb994c0\") " Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.907006 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs" (OuterVolumeSpecName: "logs") pod "784bde60-fe66-4bf1-8cc3-0d66acb994c0" (UID: "784bde60-fe66-4bf1-8cc3-0d66acb994c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.923182 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl" (OuterVolumeSpecName: "kube-api-access-nc9dl") pod "784bde60-fe66-4bf1-8cc3-0d66acb994c0" (UID: "784bde60-fe66-4bf1-8cc3-0d66acb994c0"). InnerVolumeSpecName "kube-api-access-nc9dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.944562 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data" (OuterVolumeSpecName: "config-data") pod "784bde60-fe66-4bf1-8cc3-0d66acb994c0" (UID: "784bde60-fe66-4bf1-8cc3-0d66acb994c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:47 crc kubenswrapper[4936]: I0930 13:59:47.959591 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "784bde60-fe66-4bf1-8cc3-0d66acb994c0" (UID: "784bde60-fe66-4bf1-8cc3-0d66acb994c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.008393 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/784bde60-fe66-4bf1-8cc3-0d66acb994c0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.008433 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.008446 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/784bde60-fe66-4bf1-8cc3-0d66acb994c0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.008455 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9dl\" (UniqueName: \"kubernetes.io/projected/784bde60-fe66-4bf1-8cc3-0d66acb994c0-kube-api-access-nc9dl\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.220163 4936 generic.go:334] "Generic (PLEG): container finished" podID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerID="00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42" exitCode=0 Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.220470 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerDied","Data":"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42"} Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.220502 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"784bde60-fe66-4bf1-8cc3-0d66acb994c0","Type":"ContainerDied","Data":"b6b7d920228ef51657c26cd39f0500d2add8b7b4b4bef0ca843fab08bf98b3ad"} Sep 30 13:59:48 crc kubenswrapper[4936]: 
I0930 13:59:48.220517 4936 scope.go:117] "RemoveContainer" containerID="00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.220599 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.300908 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.332155 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.353730 4936 scope.go:117] "RemoveContainer" containerID="ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.355767 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:48 crc kubenswrapper[4936]: E0930 13:59:48.356201 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-log" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.356216 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-log" Sep 30 13:59:48 crc kubenswrapper[4936]: E0930 13:59:48.356233 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-api" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.356242 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-api" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.356491 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-log" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.356512 4936 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" containerName="nova-api-api" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.360400 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.365095 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.369345 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.402687 4936 scope.go:117] "RemoveContainer" containerID="00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42" Sep 30 13:59:48 crc kubenswrapper[4936]: E0930 13:59:48.404483 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42\": container with ID starting with 00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42 not found: ID does not exist" containerID="00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.404545 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42"} err="failed to get container status \"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42\": rpc error: code = NotFound desc = could not find container \"00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42\": container with ID starting with 00a77bee2aee051d57d7e2f7efdabdd1eb703226c83e80e69f3c8ef3eea6cc42 not found: ID does not exist" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.404613 4936 scope.go:117] "RemoveContainer" 
containerID="ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1" Sep 30 13:59:48 crc kubenswrapper[4936]: E0930 13:59:48.405325 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1\": container with ID starting with ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1 not found: ID does not exist" containerID="ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.405390 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1"} err="failed to get container status \"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1\": rpc error: code = NotFound desc = could not find container \"ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1\": container with ID starting with ffc485bfbaf04068e74b6a544e2d50068aeb47e01d668b5a114203236d02cee1 not found: ID does not exist" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.414606 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dj9p\" (UniqueName: \"kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.414670 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.414703 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.414779 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.516714 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.516844 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dj9p\" (UniqueName: \"kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.516876 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.516892 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " 
pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.517859 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.524918 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.535013 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dj9p\" (UniqueName: \"kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.535868 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data\") pod \"nova-api-0\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.660460 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.697504 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.720766 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle\") pod \"40308063-f8c5-4aae-99be-736964f4ec37\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.720935 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfj92\" (UniqueName: \"kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92\") pod \"40308063-f8c5-4aae-99be-736964f4ec37\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.721149 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data\") pod \"40308063-f8c5-4aae-99be-736964f4ec37\" (UID: \"40308063-f8c5-4aae-99be-736964f4ec37\") " Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.728643 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92" (OuterVolumeSpecName: "kube-api-access-qfj92") pod "40308063-f8c5-4aae-99be-736964f4ec37" (UID: "40308063-f8c5-4aae-99be-736964f4ec37"). InnerVolumeSpecName "kube-api-access-qfj92". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.755072 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data" (OuterVolumeSpecName: "config-data") pod "40308063-f8c5-4aae-99be-736964f4ec37" (UID: "40308063-f8c5-4aae-99be-736964f4ec37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.757577 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40308063-f8c5-4aae-99be-736964f4ec37" (UID: "40308063-f8c5-4aae-99be-736964f4ec37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.824653 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.824857 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfj92\" (UniqueName: \"kubernetes.io/projected/40308063-f8c5-4aae-99be-736964f4ec37-kube-api-access-qfj92\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:48 crc kubenswrapper[4936]: I0930 13:59:48.824886 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40308063-f8c5-4aae-99be-736964f4ec37-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.202211 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.250761 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerStarted","Data":"882455403eae3835d319e84067c162b1ec11d3f031fcc13445af54bad57b3923"} Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.258076 4936 generic.go:334] "Generic (PLEG): container finished" podID="40308063-f8c5-4aae-99be-736964f4ec37" 
containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" exitCode=0 Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.258131 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40308063-f8c5-4aae-99be-736964f4ec37","Type":"ContainerDied","Data":"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f"} Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.258155 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40308063-f8c5-4aae-99be-736964f4ec37","Type":"ContainerDied","Data":"65be52be26cbbac05626bf1f0f98179209525b31191abd43a6f45f84dae21587"} Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.258173 4936 scope.go:117] "RemoveContainer" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.258260 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.311723 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.329317 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.354667 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:49 crc kubenswrapper[4936]: E0930 13:59:49.355264 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40308063-f8c5-4aae-99be-736964f4ec37" containerName="nova-scheduler-scheduler" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.355292 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="40308063-f8c5-4aae-99be-736964f4ec37" containerName="nova-scheduler-scheduler" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.355519 4936 
memory_manager.go:354] "RemoveStaleState removing state" podUID="40308063-f8c5-4aae-99be-736964f4ec37" containerName="nova-scheduler-scheduler" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.356319 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.359048 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.366122 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.369098 4936 scope.go:117] "RemoveContainer" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" Sep 30 13:59:49 crc kubenswrapper[4936]: E0930 13:59:49.371860 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f\": container with ID starting with 92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f not found: ID does not exist" containerID="92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.372102 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f"} err="failed to get container status \"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f\": rpc error: code = NotFound desc = could not find container \"92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f\": container with ID starting with 92312ed8d1b3f07c2c6914605cf2cae646136c280584f97e843607032f86fb6f not found: ID does not exist" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.442323 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.442680 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggff5\" (UniqueName: \"kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.442858 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.545318 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.545451 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.545484 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggff5\" (UniqueName: 
\"kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.555075 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.555172 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.561255 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggff5\" (UniqueName: \"kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5\") pod \"nova-scheduler-0\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") " pod="openstack/nova-scheduler-0" Sep 30 13:59:49 crc kubenswrapper[4936]: I0930 13:59:49.682745 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.128630 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.291082 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerStarted","Data":"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab"} Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.291448 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerStarted","Data":"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854"} Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.295407 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f1f27948-4455-4a0f-9d53-eadc3e0cb80d","Type":"ContainerStarted","Data":"95459dc27c1557aa908e6aabbfb663cd7899fb5e6a634fa236e0843e38b75eb8"} Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.319802 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.319783797 podStartE2EDuration="2.319783797s" podCreationTimestamp="2025-09-30 13:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:50.311704016 +0000 UTC m=+1240.695706337" watchObservedRunningTime="2025-09-30 13:59:50.319783797 +0000 UTC m=+1240.703786098" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.329325 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40308063-f8c5-4aae-99be-736964f4ec37" path="/var/lib/kubelet/pods/40308063-f8c5-4aae-99be-736964f4ec37/volumes" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.330257 4936 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784bde60-fe66-4bf1-8cc3-0d66acb994c0" path="/var/lib/kubelet/pods/784bde60-fe66-4bf1-8cc3-0d66acb994c0/volumes" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.354711 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.665916 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:59:50 crc kubenswrapper[4936]: I0930 13:59:50.665963 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 13:59:51 crc kubenswrapper[4936]: I0930 13:59:51.306053 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f1f27948-4455-4a0f-9d53-eadc3e0cb80d","Type":"ContainerStarted","Data":"df779f5d1064456c9f73325a729cd0aee2c7854ead78629d9d877c317368758d"} Sep 30 13:59:51 crc kubenswrapper[4936]: I0930 13:59:51.334295 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.334273448 podStartE2EDuration="2.334273448s" podCreationTimestamp="2025-09-30 13:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:51.323957396 +0000 UTC m=+1241.707959707" watchObservedRunningTime="2025-09-30 13:59:51.334273448 +0000 UTC m=+1241.718275749" Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.194594 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.194810 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="34168aff-c364-4158-a3e2-ff82841c060c" containerName="kube-state-metrics" 
containerID="cri-o://f3ce4becbff6479591b148a0a18bdcc170625663c71254f0ae5fcfe19cb8c5c1" gracePeriod=30 Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.328488 4936 generic.go:334] "Generic (PLEG): container finished" podID="34168aff-c364-4158-a3e2-ff82841c060c" containerID="f3ce4becbff6479591b148a0a18bdcc170625663c71254f0ae5fcfe19cb8c5c1" exitCode=2 Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.328796 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34168aff-c364-4158-a3e2-ff82841c060c","Type":"ContainerDied","Data":"f3ce4becbff6479591b148a0a18bdcc170625663c71254f0ae5fcfe19cb8c5c1"} Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.704942 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.745318 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82z9r\" (UniqueName: \"kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r\") pod \"34168aff-c364-4158-a3e2-ff82841c060c\" (UID: \"34168aff-c364-4158-a3e2-ff82841c060c\") " Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.752209 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r" (OuterVolumeSpecName: "kube-api-access-82z9r") pod "34168aff-c364-4158-a3e2-ff82841c060c" (UID: "34168aff-c364-4158-a3e2-ff82841c060c"). InnerVolumeSpecName "kube-api-access-82z9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:53 crc kubenswrapper[4936]: I0930 13:59:53.847261 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82z9r\" (UniqueName: \"kubernetes.io/projected/34168aff-c364-4158-a3e2-ff82841c060c-kube-api-access-82z9r\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.338250 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"34168aff-c364-4158-a3e2-ff82841c060c","Type":"ContainerDied","Data":"60b4acd06d609350f6d20a8841904a264111ed85fb1a33dda3a1452024c7a388"} Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.338300 4936 scope.go:117] "RemoveContainer" containerID="f3ce4becbff6479591b148a0a18bdcc170625663c71254f0ae5fcfe19cb8c5c1" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.338538 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.367739 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.382153 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.394004 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:54 crc kubenswrapper[4936]: E0930 13:59:54.394626 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34168aff-c364-4158-a3e2-ff82841c060c" containerName="kube-state-metrics" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.394649 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="34168aff-c364-4158-a3e2-ff82841c060c" containerName="kube-state-metrics" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.394875 4936 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="34168aff-c364-4158-a3e2-ff82841c060c" containerName="kube-state-metrics" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.395653 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.398241 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.398773 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.447947 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.457481 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdj7g\" (UniqueName: \"kubernetes.io/projected/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-api-access-pdj7g\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.457568 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.457900 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 
crc kubenswrapper[4936]: I0930 13:59:54.457944 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.479136 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.479430 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-central-agent" containerID="cri-o://0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4" gracePeriod=30 Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.479686 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="proxy-httpd" containerID="cri-o://5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c" gracePeriod=30 Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.479718 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-notification-agent" containerID="cri-o://fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643" gracePeriod=30 Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.479746 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="sg-core" containerID="cri-o://8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615" gracePeriod=30 Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 
13:59:54.559806 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.559854 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.559914 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdj7g\" (UniqueName: \"kubernetes.io/projected/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-api-access-pdj7g\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.559951 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.564100 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.566622 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.569311 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.583259 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdj7g\" (UniqueName: \"kubernetes.io/projected/fdabb8fe-7d22-4bd5-8676-378dc4500f6e-kube-api-access-pdj7g\") pod \"kube-state-metrics-0\" (UID: \"fdabb8fe-7d22-4bd5-8676-378dc4500f6e\") " pod="openstack/kube-state-metrics-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.683575 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 13:59:54 crc kubenswrapper[4936]: I0930 13:59:54.735028 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.240304 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 13:59:55 crc kubenswrapper[4936]: W0930 13:59:55.248643 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdabb8fe_7d22_4bd5_8676_378dc4500f6e.slice/crio-da610c6d1cf2f0dc0f069c7bd2b27b94723b6e60cd36e786d82f71e33fd87fc6 WatchSource:0}: Error finding container da610c6d1cf2f0dc0f069c7bd2b27b94723b6e60cd36e786d82f71e33fd87fc6: Status 404 returned error can't find the container with id da610c6d1cf2f0dc0f069c7bd2b27b94723b6e60cd36e786d82f71e33fd87fc6 Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.252032 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350305 4936 generic.go:334] "Generic (PLEG): container finished" podID="50b2b4d1-70a1-4044-b585-721988a65bda" containerID="5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c" exitCode=0 Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350334 4936 generic.go:334] "Generic (PLEG): container finished" podID="50b2b4d1-70a1-4044-b585-721988a65bda" containerID="8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615" exitCode=2 Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350360 4936 generic.go:334] "Generic (PLEG): container finished" podID="50b2b4d1-70a1-4044-b585-721988a65bda" containerID="0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4" exitCode=0 Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350392 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerDied","Data":"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c"} Sep 30 
13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350436 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerDied","Data":"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615"} Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.350448 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerDied","Data":"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4"} Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.351397 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fdabb8fe-7d22-4bd5-8676-378dc4500f6e","Type":"ContainerStarted","Data":"da610c6d1cf2f0dc0f069c7bd2b27b94723b6e60cd36e786d82f71e33fd87fc6"} Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.618878 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.671245 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 13:59:55 crc kubenswrapper[4936]: I0930 13:59:55.671298 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 13:59:56 crc kubenswrapper[4936]: I0930 13:59:56.327140 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34168aff-c364-4158-a3e2-ff82841c060c" path="/var/lib/kubelet/pods/34168aff-c364-4158-a3e2-ff82841c060c/volumes" Sep 30 13:59:56 crc kubenswrapper[4936]: I0930 13:59:56.365101 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fdabb8fe-7d22-4bd5-8676-378dc4500f6e","Type":"ContainerStarted","Data":"c06bf4c9c6679efb02b59748a2b1544962814649130c2e9765e90676a6323f5b"} Sep 30 13:59:56 crc 
kubenswrapper[4936]: I0930 13:59:56.365274 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 13:59:56 crc kubenswrapper[4936]: I0930 13:59:56.387118 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.00511703 podStartE2EDuration="2.387094354s" podCreationTimestamp="2025-09-30 13:59:54 +0000 UTC" firstStartedPulling="2025-09-30 13:59:55.251750732 +0000 UTC m=+1245.635753033" lastFinishedPulling="2025-09-30 13:59:55.633728056 +0000 UTC m=+1246.017730357" observedRunningTime="2025-09-30 13:59:56.381155172 +0000 UTC m=+1246.765157483" watchObservedRunningTime="2025-09-30 13:59:56.387094354 +0000 UTC m=+1246.771096655" Sep 30 13:59:56 crc kubenswrapper[4936]: I0930 13:59:56.684553 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:56 crc kubenswrapper[4936]: I0930 13:59:56.684575 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.698464 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.699061 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.857082 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946016 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946098 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946163 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946197 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946253 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946321 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkvpn\" (UniqueName: 
\"kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.946408 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts\") pod \"50b2b4d1-70a1-4044-b585-721988a65bda\" (UID: \"50b2b4d1-70a1-4044-b585-721988a65bda\") " Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.947950 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.948588 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:59:58 crc kubenswrapper[4936]: I0930 13:59:58.968125 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts" (OuterVolumeSpecName: "scripts") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.038177 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn" (OuterVolumeSpecName: "kube-api-access-rkvpn") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "kube-api-access-rkvpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.051956 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.051999 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkvpn\" (UniqueName: \"kubernetes.io/projected/50b2b4d1-70a1-4044-b585-721988a65bda-kube-api-access-rkvpn\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.052012 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.052024 4936 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50b2b4d1-70a1-4044-b585-721988a65bda-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.115478 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.151274 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.153689 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.153741 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.182300 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data" (OuterVolumeSpecName: "config-data") pod "50b2b4d1-70a1-4044-b585-721988a65bda" (UID: "50b2b4d1-70a1-4044-b585-721988a65bda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.255932 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b2b4d1-70a1-4044-b585-721988a65bda-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.401812 4936 generic.go:334] "Generic (PLEG): container finished" podID="50b2b4d1-70a1-4044-b585-721988a65bda" containerID="fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643" exitCode=0 Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.401862 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerDied","Data":"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643"} Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.401890 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50b2b4d1-70a1-4044-b585-721988a65bda","Type":"ContainerDied","Data":"9571276b8c2dcf819213bca09fce0af7dc8dced19f37eb5c1f39884a3beb3f7a"} Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.401906 4936 scope.go:117] "RemoveContainer" containerID="5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.401914 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.444514 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.462569 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.475681 4936 scope.go:117] "RemoveContainer" containerID="8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.478572 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.478921 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="proxy-httpd" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.478937 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="proxy-httpd" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.478951 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-notification-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.478958 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-notification-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.478978 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="sg-core" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.478984 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="sg-core" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.479008 4936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-central-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.479014 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-central-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.479170 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-notification-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.479181 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="proxy-httpd" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.479196 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="ceilometer-central-agent" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.479209 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" containerName="sg-core" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.494963 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.495058 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.500191 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.500383 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.500651 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.544614 4936 scope.go:117] "RemoveContainer" containerID="fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.564882 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.564947 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tngk9\" (UniqueName: \"kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.564998 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.565019 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.565076 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.565176 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.565357 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.565377 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.627458 4936 scope.go:117] "RemoveContainer" containerID="0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667272 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667347 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667385 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667401 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667416 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667445 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tngk9\" (UniqueName: \"kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9\") pod \"ceilometer-0\" (UID: 
\"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667483 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.667559 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.668137 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.668453 4936 scope.go:117] "RemoveContainer" containerID="5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.668758 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.672108 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c\": container with ID starting with 5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c not 
found: ID does not exist" containerID="5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.672279 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c"} err="failed to get container status \"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c\": rpc error: code = NotFound desc = could not find container \"5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c\": container with ID starting with 5ee267d93562e60efe3162a1223ae465a469b5aa8b3a365971338fc55749266c not found: ID does not exist" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.672430 4936 scope.go:117] "RemoveContainer" containerID="8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.672899 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615\": container with ID starting with 8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615 not found: ID does not exist" containerID="8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.673049 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615"} err="failed to get container status \"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615\": rpc error: code = NotFound desc = could not find container \"8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615\": container with ID starting with 8229c455cf5d455225fdadfce1a102d5c128eca021c868080ab39c7ba1fc7615 not found: ID does not exist" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.673158 
4936 scope.go:117] "RemoveContainer" containerID="fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.673490 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643\": container with ID starting with fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643 not found: ID does not exist" containerID="fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.673617 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643"} err="failed to get container status \"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643\": rpc error: code = NotFound desc = could not find container \"fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643\": container with ID starting with fe04b181eaeb2940c239f60999bd0297ea9ac02342180a82a7c2e28eb262d643 not found: ID does not exist" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.673717 4936 scope.go:117] "RemoveContainer" containerID="0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4" Sep 30 13:59:59 crc kubenswrapper[4936]: E0930 13:59:59.674018 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4\": container with ID starting with 0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4 not found: ID does not exist" containerID="0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.674127 4936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4"} err="failed to get container status \"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4\": rpc error: code = NotFound desc = could not find container \"0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4\": container with ID starting with 0883237df2354bae649fbc62cccda782fa944d3b00a862d355605d00459b25f4 not found: ID does not exist" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.683904 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.684455 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.686967 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.687506 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.688228 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tngk9\" (UniqueName: \"kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9\") pod \"ceilometer-0\" (UID: 
\"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.690030 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.691366 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data\") pod \"ceilometer-0\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " pod="openstack/ceilometer-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.754065 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.781575 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.782044 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 13:59:59 crc kubenswrapper[4936]: I0930 13:59:59.819100 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.123665 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.171856 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth"] Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.173321 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.176901 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.177609 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.191738 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth"] Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.282521 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.282581 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhrhj\" (UniqueName: \"kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj\") pod \"collect-profiles-29320680-wcwth\" (UID: 
\"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.282622 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.332920 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b2b4d1-70a1-4044-b585-721988a65bda" path="/var/lib/kubelet/pods/50b2b4d1-70a1-4044-b585-721988a65bda/volumes" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.384517 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhrhj\" (UniqueName: \"kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.384663 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.384823 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: 
\"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.393358 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.396780 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.408691 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhrhj\" (UniqueName: \"kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj\") pod \"collect-profiles-29320680-wcwth\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.416028 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerStarted","Data":"f369bb1138af610eff59ecd2801911cafcc93a6f2d997f73d2003db36902cf8c"} Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.464236 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 14:00:00 crc kubenswrapper[4936]: I0930 14:00:00.506771 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:01 crc kubenswrapper[4936]: I0930 14:00:01.201278 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth"] Sep 30 14:00:01 crc kubenswrapper[4936]: I0930 14:00:01.438242 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerStarted","Data":"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d"} Sep 30 14:00:01 crc kubenswrapper[4936]: I0930 14:00:01.445198 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" event={"ID":"9758fa00-0827-4d33-838d-9b9ab9703548","Type":"ContainerStarted","Data":"93b35abe5f61de46dc0d94df4733b555b32f5a4da82bb5e5bed0c37b869bf943"} Sep 30 14:00:02 crc kubenswrapper[4936]: I0930 14:00:02.455238 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerStarted","Data":"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62"} Sep 30 14:00:02 crc kubenswrapper[4936]: I0930 14:00:02.457865 4936 generic.go:334] "Generic (PLEG): container finished" podID="9758fa00-0827-4d33-838d-9b9ab9703548" containerID="b5823dbb08cfd4ccd812e5f6f8004fa251be6f9b8f0123737355d7f0adc4ec58" exitCode=0 Sep 30 14:00:02 crc kubenswrapper[4936]: I0930 14:00:02.457893 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" event={"ID":"9758fa00-0827-4d33-838d-9b9ab9703548","Type":"ContainerDied","Data":"b5823dbb08cfd4ccd812e5f6f8004fa251be6f9b8f0123737355d7f0adc4ec58"} Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.469690 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerStarted","Data":"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88"} Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.824954 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.870071 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume\") pod \"9758fa00-0827-4d33-838d-9b9ab9703548\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.870825 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume" (OuterVolumeSpecName: "config-volume") pod "9758fa00-0827-4d33-838d-9b9ab9703548" (UID: "9758fa00-0827-4d33-838d-9b9ab9703548"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.872032 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhrhj\" (UniqueName: \"kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj\") pod \"9758fa00-0827-4d33-838d-9b9ab9703548\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.872833 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume\") pod \"9758fa00-0827-4d33-838d-9b9ab9703548\" (UID: \"9758fa00-0827-4d33-838d-9b9ab9703548\") " Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.873281 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9758fa00-0827-4d33-838d-9b9ab9703548-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.879582 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj" (OuterVolumeSpecName: "kube-api-access-hhrhj") pod "9758fa00-0827-4d33-838d-9b9ab9703548" (UID: "9758fa00-0827-4d33-838d-9b9ab9703548"). InnerVolumeSpecName "kube-api-access-hhrhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.893049 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9758fa00-0827-4d33-838d-9b9ab9703548" (UID: "9758fa00-0827-4d33-838d-9b9ab9703548"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.975081 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhrhj\" (UniqueName: \"kubernetes.io/projected/9758fa00-0827-4d33-838d-9b9ab9703548-kube-api-access-hhrhj\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4936]: I0930 14:00:03.975115 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9758fa00-0827-4d33-838d-9b9ab9703548-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.363023 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.389825 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle\") pod \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.390260 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfztk\" (UniqueName: \"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk\") pod \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.390430 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data\") pod \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\" (UID: \"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9\") " Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.395626 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk" (OuterVolumeSpecName: "kube-api-access-xfztk") pod "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" (UID: "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9"). InnerVolumeSpecName "kube-api-access-xfztk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.426278 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" (UID: "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.436013 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data" (OuterVolumeSpecName: "config-data") pod "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" (UID: "17dcf6d7-edd8-49ea-8bd3-83e59c211ef9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.481629 4936 generic.go:334] "Generic (PLEG): container finished" podID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" containerID="e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0" exitCode=137 Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.481732 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9","Type":"ContainerDied","Data":"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0"} Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.481758 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17dcf6d7-edd8-49ea-8bd3-83e59c211ef9","Type":"ContainerDied","Data":"0e56b8915e3b3b0cf2cedfd4a79623bbdc6dd3628bf4a3460df5092fffa4e145"} Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.481773 4936 scope.go:117] "RemoveContainer" containerID="e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.481782 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.483995 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" event={"ID":"9758fa00-0827-4d33-838d-9b9ab9703548","Type":"ContainerDied","Data":"93b35abe5f61de46dc0d94df4733b555b32f5a4da82bb5e5bed0c37b869bf943"} Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.484300 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b35abe5f61de46dc0d94df4733b555b32f5a4da82bb5e5bed0c37b869bf943" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.484449 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.492368 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfztk\" (UniqueName: \"kubernetes.io/projected/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-kube-api-access-xfztk\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.492395 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.492404 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.516193 4936 scope.go:117] "RemoveContainer" containerID="e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0" Sep 30 14:00:04 crc kubenswrapper[4936]: E0930 14:00:04.516518 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0\": container with ID starting with e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0 not found: ID does not exist" containerID="e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.516607 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0"} err="failed to get container status \"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0\": rpc error: code = NotFound desc = could not find container 
\"e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0\": container with ID starting with e9e03e07e7d9ffdd8ec7520b8f927c09eaf501a02eda460804bda1e3a8fa6ea0 not found: ID does not exist" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.525188 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.535240 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.545436 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:00:04 crc kubenswrapper[4936]: E0930 14:00:04.546035 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9758fa00-0827-4d33-838d-9b9ab9703548" containerName="collect-profiles" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.546105 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9758fa00-0827-4d33-838d-9b9ab9703548" containerName="collect-profiles" Sep 30 14:00:04 crc kubenswrapper[4936]: E0930 14:00:04.546171 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.546234 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.546487 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9758fa00-0827-4d33-838d-9b9ab9703548" containerName="collect-profiles" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.546574 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.547216 4936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.550424 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.550582 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.550701 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.553590 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.594487 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.594585 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.594644 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdz7\" (UniqueName: \"kubernetes.io/projected/54f05d80-47b0-406c-a0be-856756410f2a-kube-api-access-lkdz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.594867 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.594932 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.697116 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.697434 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.697469 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdz7\" (UniqueName: \"kubernetes.io/projected/54f05d80-47b0-406c-a0be-856756410f2a-kube-api-access-lkdz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.699557 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.699608 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.701587 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.711952 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.714630 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.714986 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/54f05d80-47b0-406c-a0be-856756410f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.716611 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdz7\" (UniqueName: \"kubernetes.io/projected/54f05d80-47b0-406c-a0be-856756410f2a-kube-api-access-lkdz7\") pod \"nova-cell1-novncproxy-0\" (UID: \"54f05d80-47b0-406c-a0be-856756410f2a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.748010 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 14:00:04 crc kubenswrapper[4936]: I0930 14:00:04.882565 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:05 crc kubenswrapper[4936]: I0930 14:00:05.366749 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:00:05 crc kubenswrapper[4936]: I0930 14:00:05.496750 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54f05d80-47b0-406c-a0be-856756410f2a","Type":"ContainerStarted","Data":"2f12267260a30f0cf99fd3443358ca85a18829cb271f1ac1cbf17c8dd493406e"} Sep 30 14:00:05 crc kubenswrapper[4936]: I0930 14:00:05.675933 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:00:05 crc kubenswrapper[4936]: I0930 14:00:05.682919 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:00:05 crc kubenswrapper[4936]: I0930 14:00:05.684128 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Sep 30 14:00:06 crc kubenswrapper[4936]: I0930 14:00:06.327378 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17dcf6d7-edd8-49ea-8bd3-83e59c211ef9" path="/var/lib/kubelet/pods/17dcf6d7-edd8-49ea-8bd3-83e59c211ef9/volumes" Sep 30 14:00:06 crc kubenswrapper[4936]: I0930 14:00:06.508256 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"54f05d80-47b0-406c-a0be-856756410f2a","Type":"ContainerStarted","Data":"16ef66c85eb3e0e917faf82cec62ab5050a83d7cfd3d4a1332eb028981025262"} Sep 30 14:00:06 crc kubenswrapper[4936]: I0930 14:00:06.518599 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:00:06 crc kubenswrapper[4936]: I0930 14:00:06.530600 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.53057771 podStartE2EDuration="2.53057771s" podCreationTimestamp="2025-09-30 14:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:06.528033931 +0000 UTC m=+1256.912036252" watchObservedRunningTime="2025-09-30 14:00:06.53057771 +0000 UTC m=+1256.914580011" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.701110 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.701187 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.701927 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.701948 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 
14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.705685 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.712771 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.925224 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.930779 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.957592 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.990656 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.990722 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.990775 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9ph\" (UniqueName: \"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: 
\"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.990927 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:08 crc kubenswrapper[4936]: I0930 14:00:08.990953 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.094391 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.094442 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.094502 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " 
pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.094536 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.094565 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9ph\" (UniqueName: \"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.095988 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.096602 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.097173 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 
14:00:09.097759 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.126826 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9ph\" (UniqueName: \"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph\") pod \"dnsmasq-dns-5b856c5697-jkmjc\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.200548 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.467963 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.566318 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" event={"ID":"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e","Type":"ContainerStarted","Data":"951d5d505ffee6bd0df0e0acbf82980c51b354aaf87bb8912307d3f764531af6"} Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.588067 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerStarted","Data":"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27"} Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.588126 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.610918 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.8226365 podStartE2EDuration="10.610900298s" podCreationTimestamp="2025-09-30 13:59:59 +0000 UTC" firstStartedPulling="2025-09-30 14:00:00.154273774 +0000 UTC m=+1250.538276075" lastFinishedPulling="2025-09-30 14:00:08.942537572 +0000 UTC m=+1259.326539873" observedRunningTime="2025-09-30 14:00:09.610662272 +0000 UTC m=+1259.994664583" watchObservedRunningTime="2025-09-30 14:00:09.610900298 +0000 UTC m=+1259.994902599" Sep 30 14:00:09 crc kubenswrapper[4936]: I0930 14:00:09.883248 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:10 crc kubenswrapper[4936]: I0930 14:00:10.611777 4936 generic.go:334] "Generic (PLEG): container finished" podID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerID="2617c80c17839a861ff2fb3497bbc7deb07b8abf88ed30b2c5cbfca6788ca1be" exitCode=0 Sep 30 14:00:10 crc kubenswrapper[4936]: I0930 14:00:10.613650 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" event={"ID":"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e","Type":"ContainerDied","Data":"2617c80c17839a861ff2fb3497bbc7deb07b8abf88ed30b2c5cbfca6788ca1be"} Sep 30 14:00:10 crc kubenswrapper[4936]: I0930 14:00:10.757282 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:11 crc kubenswrapper[4936]: I0930 14:00:11.622839 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" event={"ID":"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e","Type":"ContainerStarted","Data":"a5d810c6d3904389881d6c3c64e729fa908389da3275cb0713b12784c0504c75"} Sep 30 14:00:11 crc kubenswrapper[4936]: I0930 14:00:11.648295 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" podStartSLOduration=3.648272258 podStartE2EDuration="3.648272258s" podCreationTimestamp="2025-09-30 14:00:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:11.646635763 +0000 UTC m=+1262.030638094" watchObservedRunningTime="2025-09-30 14:00:11.648272258 +0000 UTC m=+1262.032274559" Sep 30 14:00:11 crc kubenswrapper[4936]: I0930 14:00:11.938005 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:11 crc kubenswrapper[4936]: I0930 14:00:11.938226 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-log" containerID="cri-o://b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854" gracePeriod=30 Sep 30 14:00:11 crc kubenswrapper[4936]: I0930 14:00:11.939622 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-api" containerID="cri-o://c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab" gracePeriod=30 Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634143 4936 generic.go:334] "Generic (PLEG): container finished" podID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerID="b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854" exitCode=143 Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634235 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerDied","Data":"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854"} Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634576 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634603 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-central-agent" containerID="cri-o://948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d" gracePeriod=30 Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634681 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-notification-agent" containerID="cri-o://279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62" gracePeriod=30 Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634674 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="sg-core" containerID="cri-o://b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88" gracePeriod=30 Sep 30 14:00:12 crc kubenswrapper[4936]: I0930 14:00:12.634681 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="proxy-httpd" containerID="cri-o://33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27" gracePeriod=30 Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.643658 4936 generic.go:334] "Generic (PLEG): container finished" podID="3311e389-86f6-468f-88aa-84c61d64b978" containerID="33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27" exitCode=0 Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.644002 4936 generic.go:334] "Generic (PLEG): container finished" podID="3311e389-86f6-468f-88aa-84c61d64b978" containerID="b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88" exitCode=2 Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.644016 4936 generic.go:334] "Generic (PLEG): container finished" podID="3311e389-86f6-468f-88aa-84c61d64b978" containerID="948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d" exitCode=0 
Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.643709 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerDied","Data":"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27"} Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.644364 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerDied","Data":"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88"} Sep 30 14:00:13 crc kubenswrapper[4936]: I0930 14:00:13.644414 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerDied","Data":"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d"} Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.314524 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.425868 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tngk9\" (UniqueName: \"kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426014 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426057 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426167 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426190 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426208 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426233 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.426299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs\") pod \"3311e389-86f6-468f-88aa-84c61d64b978\" (UID: \"3311e389-86f6-468f-88aa-84c61d64b978\") " Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.431540 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.456352 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.457620 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9" (OuterVolumeSpecName: "kube-api-access-tngk9") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "kube-api-access-tngk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.494513 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.499108 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts" (OuterVolumeSpecName: "scripts") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.529282 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tngk9\" (UniqueName: \"kubernetes.io/projected/3311e389-86f6-468f-88aa-84c61d64b978-kube-api-access-tngk9\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.529314 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.529327 4936 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.529357 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.529369 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3311e389-86f6-468f-88aa-84c61d64b978-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.530618 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.552752 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.597651 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data" (OuterVolumeSpecName: "config-data") pod "3311e389-86f6-468f-88aa-84c61d64b978" (UID: "3311e389-86f6-468f-88aa-84c61d64b978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.630652 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.630859 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.630957 4936 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3311e389-86f6-468f-88aa-84c61d64b978-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.655556 4936 generic.go:334] "Generic (PLEG): container finished" podID="3311e389-86f6-468f-88aa-84c61d64b978" containerID="279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62" exitCode=0 Sep 30 
14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.655665 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.655684 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerDied","Data":"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62"} Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.657184 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3311e389-86f6-468f-88aa-84c61d64b978","Type":"ContainerDied","Data":"f369bb1138af610eff59ecd2801911cafcc93a6f2d997f73d2003db36902cf8c"} Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.657208 4936 scope.go:117] "RemoveContainer" containerID="33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.677206 4936 scope.go:117] "RemoveContainer" containerID="b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.693445 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.701303 4936 scope.go:117] "RemoveContainer" containerID="279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.708277 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.717719 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.718141 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="sg-core" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718161 
4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="sg-core" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.718199 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="proxy-httpd" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718208 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="proxy-httpd" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.718224 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-central-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718231 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-central-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.718250 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-notification-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718258 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-notification-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718520 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="sg-core" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718540 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-central-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.718563 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="proxy-httpd" Sep 30 14:00:14 crc kubenswrapper[4936]: 
I0930 14:00:14.718578 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3311e389-86f6-468f-88aa-84c61d64b978" containerName="ceilometer-notification-agent" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.720467 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.731966 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.733431 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.733698 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.733728 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.744192 4936 scope.go:117] "RemoveContainer" containerID="948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.808395 4936 scope.go:117] "RemoveContainer" containerID="33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.809839 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27\": container with ID starting with 33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27 not found: ID does not exist" containerID="33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.809877 4936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27"} err="failed to get container status \"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27\": rpc error: code = NotFound desc = could not find container \"33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27\": container with ID starting with 33683c973d689d5ab1ad084e4737baaaec4b0cce9d2f0c74312434a43a426a27 not found: ID does not exist" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.809897 4936 scope.go:117] "RemoveContainer" containerID="b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.810285 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88\": container with ID starting with b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88 not found: ID does not exist" containerID="b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.810417 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88"} err="failed to get container status \"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88\": rpc error: code = NotFound desc = could not find container \"b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88\": container with ID starting with b5406d385eb72b8eecad7f78d1f228d072d5dce181eed270562721c712f5ad88 not found: ID does not exist" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.810453 4936 scope.go:117] "RemoveContainer" containerID="279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.811054 4936 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62\": container with ID starting with 279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62 not found: ID does not exist" containerID="279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.811078 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62"} err="failed to get container status \"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62\": rpc error: code = NotFound desc = could not find container \"279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62\": container with ID starting with 279419ad4ef00b21eb87bc7522aa51d1c434015cd81d6daf7bb8030725293f62 not found: ID does not exist" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.811184 4936 scope.go:117] "RemoveContainer" containerID="948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d" Sep 30 14:00:14 crc kubenswrapper[4936]: E0930 14:00:14.811439 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d\": container with ID starting with 948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d not found: ID does not exist" containerID="948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.811466 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d"} err="failed to get container status \"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d\": rpc error: code = NotFound desc = could not find container 
\"948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d\": container with ID starting with 948828a677004030585a0e396a8dbdebcb84decd8a4d82eb67f8b646340c176d not found: ID does not exist" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835088 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835158 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835188 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835221 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835256 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts\") pod \"ceilometer-0\" (UID: 
\"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835284 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835417 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.835532 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql5w\" (UniqueName: \"kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.883944 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.900976 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.936794 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.936838 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.936862 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.936912 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql5w\" (UniqueName: \"kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.937028 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.937057 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.937080 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.937106 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.939020 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.939277 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.941993 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.942845 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.943847 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.944260 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.953062 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:14 crc kubenswrapper[4936]: I0930 14:00:14.957391 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql5w\" (UniqueName: \"kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w\") pod \"ceilometer-0\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " pod="openstack/ceilometer-0" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.042777 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.499035 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.562963 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.649808 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs\") pod \"d31fd98d-f23c-45f4-88c2-0355b99d0114\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.649907 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data\") pod \"d31fd98d-f23c-45f4-88c2-0355b99d0114\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.649999 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle\") pod \"d31fd98d-f23c-45f4-88c2-0355b99d0114\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.650028 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dj9p\" (UniqueName: \"kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p\") pod \"d31fd98d-f23c-45f4-88c2-0355b99d0114\" (UID: \"d31fd98d-f23c-45f4-88c2-0355b99d0114\") " Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.652383 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs" (OuterVolumeSpecName: "logs") pod "d31fd98d-f23c-45f4-88c2-0355b99d0114" (UID: "d31fd98d-f23c-45f4-88c2-0355b99d0114"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.668193 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p" (OuterVolumeSpecName: "kube-api-access-7dj9p") pod "d31fd98d-f23c-45f4-88c2-0355b99d0114" (UID: "d31fd98d-f23c-45f4-88c2-0355b99d0114"). InnerVolumeSpecName "kube-api-access-7dj9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.674178 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerStarted","Data":"ac4d2539fd9953e9cf3c49fde4ef7561dd6a73d3bdf94ec7216a97350eb960f4"} Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.679074 4936 generic.go:334] "Generic (PLEG): container finished" podID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerID="c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab" exitCode=0 Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.679191 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.679203 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerDied","Data":"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab"} Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.679237 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d31fd98d-f23c-45f4-88c2-0355b99d0114","Type":"ContainerDied","Data":"882455403eae3835d319e84067c162b1ec11d3f031fcc13445af54bad57b3923"} Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.679258 4936 scope.go:117] "RemoveContainer" containerID="c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.700972 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data" (OuterVolumeSpecName: "config-data") pod "d31fd98d-f23c-45f4-88c2-0355b99d0114" (UID: "d31fd98d-f23c-45f4-88c2-0355b99d0114"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.711622 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.719658 4936 scope.go:117] "RemoveContainer" containerID="b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.755581 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d31fd98d-f23c-45f4-88c2-0355b99d0114" (UID: "d31fd98d-f23c-45f4-88c2-0355b99d0114"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.756623 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.756647 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31fd98d-f23c-45f4-88c2-0355b99d0114-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.756658 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dj9p\" (UniqueName: \"kubernetes.io/projected/d31fd98d-f23c-45f4-88c2-0355b99d0114-kube-api-access-7dj9p\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.756675 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d31fd98d-f23c-45f4-88c2-0355b99d0114-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.766996 4936 scope.go:117] "RemoveContainer" containerID="c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab" Sep 30 14:00:15 crc kubenswrapper[4936]: E0930 14:00:15.767678 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab\": container with ID starting with c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab not found: ID does not exist" containerID="c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.767719 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab"} 
err="failed to get container status \"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab\": rpc error: code = NotFound desc = could not find container \"c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab\": container with ID starting with c562c862a2adff562e4b5fbf4362942efde2061c0277077d5b447335156cdfab not found: ID does not exist" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.767747 4936 scope.go:117] "RemoveContainer" containerID="b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854" Sep 30 14:00:15 crc kubenswrapper[4936]: E0930 14:00:15.768096 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854\": container with ID starting with b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854 not found: ID does not exist" containerID="b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.768122 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854"} err="failed to get container status \"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854\": rpc error: code = NotFound desc = could not find container \"b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854\": container with ID starting with b153a7583cba771c1804dbd62d3b5dc7bcc4f41738ecd3161a43e2144d933854 not found: ID does not exist" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.899631 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5npq5"] Sep 30 14:00:15 crc kubenswrapper[4936]: E0930 14:00:15.900009 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-log" Sep 30 14:00:15 crc 
kubenswrapper[4936]: I0930 14:00:15.900023 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-log" Sep 30 14:00:15 crc kubenswrapper[4936]: E0930 14:00:15.900057 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-api" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.900063 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-api" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.900235 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-log" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.900252 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" containerName="nova-api-api" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.900873 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.906833 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.906960 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.934862 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5npq5"] Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.959349 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.959392 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.959466 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:15 crc kubenswrapper[4936]: I0930 14:00:15.959536 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2t48\" (UniqueName: 
\"kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.061170 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.061530 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2t48\" (UniqueName: \"kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.061611 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.061628 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.067056 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.067117 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.076314 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.077347 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.077594 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.120416 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.122352 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.126538 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.131821 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.132041 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.132210 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.134913 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2t48\" (UniqueName: \"kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48\") pod \"nova-cell1-cell-mapping-5npq5\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164604 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164699 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164770 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164793 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164814 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqrt\" (UniqueName: \"kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.164835 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.232234 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.266865 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267020 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267061 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267092 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqrt\" (UniqueName: \"kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267127 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267213 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.267701 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.283026 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.283439 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.283773 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.288875 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.291603 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9xqrt\" (UniqueName: \"kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt\") pod \"nova-api-0\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") " pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.346320 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3311e389-86f6-468f-88aa-84c61d64b978" path="/var/lib/kubelet/pods/3311e389-86f6-468f-88aa-84c61d64b978/volumes" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.347877 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31fd98d-f23c-45f4-88c2-0355b99d0114" path="/var/lib/kubelet/pods/d31fd98d-f23c-45f4-88c2-0355b99d0114/volumes" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.541867 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.714048 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerStarted","Data":"31155e3de1ebb1e32246611383b79d224d86f23aba7b0fcd82434a6212289502"} Sep 30 14:00:16 crc kubenswrapper[4936]: I0930 14:00:16.760685 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5npq5"] Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.102492 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.750288 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerStarted","Data":"a26cec4444b1ccba15c6d11da9da04b69f2847ae56603f995eed9feaef2049b1"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.762133 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-5npq5" event={"ID":"cb6eb7d7-1462-435c-b651-7c28fbc5256c","Type":"ContainerStarted","Data":"305017129b05b760bf688c9f231a690a3c5a17e6794fc81a822ccbdd493b299b"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.762188 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5npq5" event={"ID":"cb6eb7d7-1462-435c-b651-7c28fbc5256c","Type":"ContainerStarted","Data":"ab7b8f8123052b31ea0edbce5e08383272da4e532897809a458f21d98708f79d"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.763878 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerStarted","Data":"6bf24c92495a136fb8543b9120654133e2f8eb9c2465872dfc3240f3c76b9ad9"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.763906 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerStarted","Data":"09870d53d285f4d721875d5a7c36fc0fe8018459fcd8225c0982260a1139bfe6"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.763916 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerStarted","Data":"fd1d69135ad076813e37857b349a4c155cc8f48c55ac8d198e0a9268be67d15e"} Sep 30 14:00:17 crc kubenswrapper[4936]: I0930 14:00:17.803113 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5npq5" podStartSLOduration=2.803082325 podStartE2EDuration="2.803082325s" podCreationTimestamp="2025-09-30 14:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:17.790961184 +0000 UTC m=+1268.174963485" watchObservedRunningTime="2025-09-30 14:00:17.803082325 +0000 UTC m=+1268.187084616" Sep 30 14:00:18 crc 
kubenswrapper[4936]: I0930 14:00:18.773915 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerStarted","Data":"e22fa2826711a07486a9cff26bcd07352ea88f10436b201ddfbff8eaeb40c8ae"} Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.202575 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.227347 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.227304077 podStartE2EDuration="3.227304077s" podCreationTimestamp="2025-09-30 14:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:17.863483505 +0000 UTC m=+1268.247485816" watchObservedRunningTime="2025-09-30 14:00:19.227304077 +0000 UTC m=+1269.611306378" Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.273549 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.273794 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="dnsmasq-dns" containerID="cri-o://1c6de2f662fa7dee0c7e3dcd4c79caa5be921e0d31d4c1f921fad4a09d24940b" gracePeriod=10 Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.798514 4936 generic.go:334] "Generic (PLEG): container finished" podID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerID="1c6de2f662fa7dee0c7e3dcd4c79caa5be921e0d31d4c1f921fad4a09d24940b" exitCode=0 Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.798865 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" 
event={"ID":"ce0b046e-7c77-461b-ad62-1ece23a1225c","Type":"ContainerDied","Data":"1c6de2f662fa7dee0c7e3dcd4c79caa5be921e0d31d4c1f921fad4a09d24940b"} Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.820754 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerStarted","Data":"1dcfe9d384a17bede757c5fd2d6523a1742aa9b6df8337fa8d25aa7c8d932bab"} Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.822094 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.870935 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.095570355 podStartE2EDuration="5.870916967s" podCreationTimestamp="2025-09-30 14:00:14 +0000 UTC" firstStartedPulling="2025-09-30 14:00:15.508506289 +0000 UTC m=+1265.892508580" lastFinishedPulling="2025-09-30 14:00:19.283852891 +0000 UTC m=+1269.667855192" observedRunningTime="2025-09-30 14:00:19.8607968 +0000 UTC m=+1270.244799121" watchObservedRunningTime="2025-09-30 14:00:19.870916967 +0000 UTC m=+1270.254919258" Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.952995 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 14:00:19 crc kubenswrapper[4936]: I0930 14:00:19.962718 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb\") pod \"ce0b046e-7c77-461b-ad62-1ece23a1225c\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.022038 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce0b046e-7c77-461b-ad62-1ece23a1225c" (UID: "ce0b046e-7c77-461b-ad62-1ece23a1225c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.063755 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpwn2\" (UniqueName: \"kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2\") pod \"ce0b046e-7c77-461b-ad62-1ece23a1225c\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.063939 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config\") pod \"ce0b046e-7c77-461b-ad62-1ece23a1225c\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.063961 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc\") pod \"ce0b046e-7c77-461b-ad62-1ece23a1225c\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.063990 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb\") pod \"ce0b046e-7c77-461b-ad62-1ece23a1225c\" (UID: \"ce0b046e-7c77-461b-ad62-1ece23a1225c\") " Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.064355 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.080583 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2" (OuterVolumeSpecName: "kube-api-access-bpwn2") pod "ce0b046e-7c77-461b-ad62-1ece23a1225c" (UID: "ce0b046e-7c77-461b-ad62-1ece23a1225c"). InnerVolumeSpecName "kube-api-access-bpwn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.120633 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce0b046e-7c77-461b-ad62-1ece23a1225c" (UID: "ce0b046e-7c77-461b-ad62-1ece23a1225c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.121455 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config" (OuterVolumeSpecName: "config") pod "ce0b046e-7c77-461b-ad62-1ece23a1225c" (UID: "ce0b046e-7c77-461b-ad62-1ece23a1225c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.139942 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce0b046e-7c77-461b-ad62-1ece23a1225c" (UID: "ce0b046e-7c77-461b-ad62-1ece23a1225c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.165866 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpwn2\" (UniqueName: \"kubernetes.io/projected/ce0b046e-7c77-461b-ad62-1ece23a1225c-kube-api-access-bpwn2\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.165914 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.165923 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.165932 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0b046e-7c77-461b-ad62-1ece23a1225c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.833004 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.833887 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-z47ww" event={"ID":"ce0b046e-7c77-461b-ad62-1ece23a1225c","Type":"ContainerDied","Data":"329fba4d9645d125ddb64e04ffdc949e645be2a6cf374cdab0084cea690edf56"} Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.833920 4936 scope.go:117] "RemoveContainer" containerID="1c6de2f662fa7dee0c7e3dcd4c79caa5be921e0d31d4c1f921fad4a09d24940b" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.864373 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.865423 4936 scope.go:117] "RemoveContainer" containerID="0037111496d5c4e942d4dd609ec00d06a8d1b35d7263467ec401fc53cb6f385f" Sep 30 14:00:20 crc kubenswrapper[4936]: I0930 14:00:20.885465 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-z47ww"] Sep 30 14:00:22 crc kubenswrapper[4936]: I0930 14:00:22.333075 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" path="/var/lib/kubelet/pods/ce0b046e-7c77-461b-ad62-1ece23a1225c/volumes" Sep 30 14:00:23 crc kubenswrapper[4936]: I0930 14:00:23.856647 4936 generic.go:334] "Generic (PLEG): container finished" podID="cb6eb7d7-1462-435c-b651-7c28fbc5256c" containerID="305017129b05b760bf688c9f231a690a3c5a17e6794fc81a822ccbdd493b299b" exitCode=0 Sep 30 14:00:23 crc kubenswrapper[4936]: I0930 14:00:23.857942 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5npq5" event={"ID":"cb6eb7d7-1462-435c-b651-7c28fbc5256c","Type":"ContainerDied","Data":"305017129b05b760bf688c9f231a690a3c5a17e6794fc81a822ccbdd493b299b"} Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.293108 4936 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5npq5" Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.489514 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts\") pod \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.489579 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data\") pod \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.489668 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2t48\" (UniqueName: \"kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48\") pod \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.489748 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle\") pod \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\" (UID: \"cb6eb7d7-1462-435c-b651-7c28fbc5256c\") " Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.506179 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts" (OuterVolumeSpecName: "scripts") pod "cb6eb7d7-1462-435c-b651-7c28fbc5256c" (UID: "cb6eb7d7-1462-435c-b651-7c28fbc5256c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.506949 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48" (OuterVolumeSpecName: "kube-api-access-g2t48") pod "cb6eb7d7-1462-435c-b651-7c28fbc5256c" (UID: "cb6eb7d7-1462-435c-b651-7c28fbc5256c"). InnerVolumeSpecName "kube-api-access-g2t48". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.523128 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb6eb7d7-1462-435c-b651-7c28fbc5256c" (UID: "cb6eb7d7-1462-435c-b651-7c28fbc5256c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.523680 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data" (OuterVolumeSpecName: "config-data") pod "cb6eb7d7-1462-435c-b651-7c28fbc5256c" (UID: "cb6eb7d7-1462-435c-b651-7c28fbc5256c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.591745 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.591776 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.591787 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2t48\" (UniqueName: \"kubernetes.io/projected/cb6eb7d7-1462-435c-b651-7c28fbc5256c-kube-api-access-g2t48\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.591797 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6eb7d7-1462-435c-b651-7c28fbc5256c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.881982 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5npq5" event={"ID":"cb6eb7d7-1462-435c-b651-7c28fbc5256c","Type":"ContainerDied","Data":"ab7b8f8123052b31ea0edbce5e08383272da4e532897809a458f21d98708f79d"}
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.882246 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab7b8f8123052b31ea0edbce5e08383272da4e532897809a458f21d98708f79d"
Sep 30 14:00:25 crc kubenswrapper[4936]: I0930 14:00:25.882058 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5npq5"
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.069996 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.070457 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-log" containerID="cri-o://09870d53d285f4d721875d5a7c36fc0fe8018459fcd8225c0982260a1139bfe6" gracePeriod=30
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.070541 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-api" containerID="cri-o://6bf24c92495a136fb8543b9120654133e2f8eb9c2465872dfc3240f3c76b9ad9" gracePeriod=30
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.110886 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.111206 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" containerName="nova-scheduler-scheduler" containerID="cri-o://df779f5d1064456c9f73325a729cd0aee2c7854ead78629d9d877c317368758d" gracePeriod=30
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.120217 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.120525 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-log" containerID="cri-o://a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67" gracePeriod=30
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.120610 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-metadata" containerID="cri-o://582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18" gracePeriod=30
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.892102 4936 generic.go:334] "Generic (PLEG): container finished" podID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerID="09870d53d285f4d721875d5a7c36fc0fe8018459fcd8225c0982260a1139bfe6" exitCode=143
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.892156 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerDied","Data":"09870d53d285f4d721875d5a7c36fc0fe8018459fcd8225c0982260a1139bfe6"}
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.894041 4936 generic.go:334] "Generic (PLEG): container finished" podID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerID="a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67" exitCode=143
Sep 30 14:00:26 crc kubenswrapper[4936]: I0930 14:00:26.894070 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerDied","Data":"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67"}
Sep 30 14:00:27 crc kubenswrapper[4936]: I0930 14:00:27.905564 4936 generic.go:334] "Generic (PLEG): container finished" podID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerID="6bf24c92495a136fb8543b9120654133e2f8eb9c2465872dfc3240f3c76b9ad9" exitCode=0
Sep 30 14:00:27 crc kubenswrapper[4936]: I0930 14:00:27.906596 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerDied","Data":"6bf24c92495a136fb8543b9120654133e2f8eb9c2465872dfc3240f3c76b9ad9"}
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.012862 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144271 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144700 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144742 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xqrt\" (UniqueName: \"kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144759 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144803 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144808 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs" (OuterVolumeSpecName: "logs") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.144821 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data\") pod \"b536127c-3a9c-40ce-9c7f-45485f8ba076\" (UID: \"b536127c-3a9c-40ce-9c7f-45485f8ba076\") "
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.145239 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b536127c-3a9c-40ce-9c7f-45485f8ba076-logs\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.150076 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt" (OuterVolumeSpecName: "kube-api-access-9xqrt") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "kube-api-access-9xqrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.183372 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.186275 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data" (OuterVolumeSpecName: "config-data") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.192969 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.200001 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b536127c-3a9c-40ce-9c7f-45485f8ba076" (UID: "b536127c-3a9c-40ce-9c7f-45485f8ba076"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.246464 4936 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.246515 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xqrt\" (UniqueName: \"kubernetes.io/projected/b536127c-3a9c-40ce-9c7f-45485f8ba076-kube-api-access-9xqrt\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.246528 4936 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.246540 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.246548 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b536127c-3a9c-40ce-9c7f-45485f8ba076-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.919704 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b536127c-3a9c-40ce-9c7f-45485f8ba076","Type":"ContainerDied","Data":"fd1d69135ad076813e37857b349a4c155cc8f48c55ac8d198e0a9268be67d15e"}
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.919990 4936 scope.go:117] "RemoveContainer" containerID="6bf24c92495a136fb8543b9120654133e2f8eb9c2465872dfc3240f3c76b9ad9"
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.920529 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.921103 4936 generic.go:334] "Generic (PLEG): container finished" podID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" containerID="df779f5d1064456c9f73325a729cd0aee2c7854ead78629d9d877c317368758d" exitCode=0
Sep 30 14:00:28 crc kubenswrapper[4936]: I0930 14:00:28.921129 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f1f27948-4455-4a0f-9d53-eadc3e0cb80d","Type":"ContainerDied","Data":"df779f5d1064456c9f73325a729cd0aee2c7854ead78629d9d877c317368758d"}
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.008448 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.021239 4936 scope.go:117] "RemoveContainer" containerID="09870d53d285f4d721875d5a7c36fc0fe8018459fcd8225c0982260a1139bfe6"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.027443 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.037789 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.070491 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.070983 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="init"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071005 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="init"
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.071028 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="dnsmasq-dns"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071035 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="dnsmasq-dns"
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.071053 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-log"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071060 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-log"
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.071082 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-api"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071089 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-api"
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.071109 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6eb7d7-1462-435c-b651-7c28fbc5256c" containerName="nova-manage"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071116 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6eb7d7-1462-435c-b651-7c28fbc5256c" containerName="nova-manage"
Sep 30 14:00:29 crc kubenswrapper[4936]: E0930 14:00:29.071130 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" containerName="nova-scheduler-scheduler"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071137 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" containerName="nova-scheduler-scheduler"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071380 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-api"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071400 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6eb7d7-1462-435c-b651-7c28fbc5256c" containerName="nova-manage"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071409 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" containerName="nova-scheduler-scheduler"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071417 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" containerName="nova-api-log"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.071432 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0b046e-7c77-461b-ad62-1ece23a1225c" containerName="dnsmasq-dns"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.072640 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.077772 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.078000 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.078282 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.082788 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.164130 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle\") pod \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") "
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.164449 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggff5\" (UniqueName: \"kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5\") pod \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") "
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.164661 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data\") pod \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\" (UID: \"f1f27948-4455-4a0f-9d53-eadc3e0cb80d\") "
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.187552 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5" (OuterVolumeSpecName: "kube-api-access-ggff5") pod "f1f27948-4455-4a0f-9d53-eadc3e0cb80d" (UID: "f1f27948-4455-4a0f-9d53-eadc3e0cb80d"). InnerVolumeSpecName "kube-api-access-ggff5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.191184 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data" (OuterVolumeSpecName: "config-data") pod "f1f27948-4455-4a0f-9d53-eadc3e0cb80d" (UID: "f1f27948-4455-4a0f-9d53-eadc3e0cb80d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.218057 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f27948-4455-4a0f-9d53-eadc3e0cb80d" (UID: "f1f27948-4455-4a0f-9d53-eadc3e0cb80d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267038 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267111 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267180 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-config-data\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267277 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267305 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78596c14-d0a4-444c-8096-962a9359418a-logs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267325 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9lmq\" (UniqueName: \"kubernetes.io/projected/78596c14-d0a4-444c-8096-962a9359418a-kube-api-access-m9lmq\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267447 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267457 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggff5\" (UniqueName: \"kubernetes.io/projected/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-kube-api-access-ggff5\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.267468 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f27948-4455-4a0f-9d53-eadc3e0cb80d-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.368746 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78596c14-d0a4-444c-8096-962a9359418a-logs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369044 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9lmq\" (UniqueName: \"kubernetes.io/projected/78596c14-d0a4-444c-8096-962a9359418a-kube-api-access-m9lmq\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369188 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369273 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369391 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-config-data\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369538 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.369237 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78596c14-d0a4-444c-8096-962a9359418a-logs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.372814 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.372905 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.373318 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-public-tls-certs\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.376944 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78596c14-d0a4-444c-8096-962a9359418a-config-data\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.388571 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9lmq\" (UniqueName: \"kubernetes.io/projected/78596c14-d0a4-444c-8096-962a9359418a-kube-api-access-m9lmq\") pod \"nova-api-0\" (UID: \"78596c14-d0a4-444c-8096-962a9359418a\") " pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.393100 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.878471 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 14:00:29 crc kubenswrapper[4936]: W0930 14:00:29.918634 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78596c14_d0a4_444c_8096_962a9359418a.slice/crio-695454d88f9754ee4f141461d95f002872076041a247b27bbd2c9067b689a5ca WatchSource:0}: Error finding container 695454d88f9754ee4f141461d95f002872076041a247b27bbd2c9067b689a5ca: Status 404 returned error can't find the container with id 695454d88f9754ee4f141461d95f002872076041a247b27bbd2c9067b689a5ca
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.941281 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f1f27948-4455-4a0f-9d53-eadc3e0cb80d","Type":"ContainerDied","Data":"95459dc27c1557aa908e6aabbfb663cd7899fb5e6a634fa236e0843e38b75eb8"}
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.941653 4936 scope.go:117] "RemoveContainer" containerID="df779f5d1064456c9f73325a729cd0aee2c7854ead78629d9d877c317368758d"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.941492 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:00:29 crc kubenswrapper[4936]: I0930 14:00:29.944039 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78596c14-d0a4-444c-8096-962a9359418a","Type":"ContainerStarted","Data":"695454d88f9754ee4f141461d95f002872076041a247b27bbd2c9067b689a5ca"}
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.097366 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.115367 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.149850 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.151359 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.161593 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.170908 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.290953 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9f49\" (UniqueName: \"kubernetes.io/projected/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-kube-api-access-d9f49\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.291488 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-config-data\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.291527 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.375033 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b536127c-3a9c-40ce-9c7f-45485f8ba076" path="/var/lib/kubelet/pods/b536127c-3a9c-40ce-9c7f-45485f8ba076/volumes"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.376195 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f27948-4455-4a0f-9d53-eadc3e0cb80d" path="/var/lib/kubelet/pods/f1f27948-4455-4a0f-9d53-eadc3e0cb80d/volumes"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.392996 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-config-data\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.393125 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.393248 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9f49\" (UniqueName: \"kubernetes.io/projected/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-kube-api-access-d9f49\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.408611 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-config-data\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.411583 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.415066 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9f49\" (UniqueName: \"kubernetes.io/projected/44f3fb8e-7a0f-4e09-877d-eb823eac2b78-kube-api-access-d9f49\") pod \"nova-scheduler-0\" (UID: \"44f3fb8e-7a0f-4e09-877d-eb823eac2b78\") " pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.468702 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.471932 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.596153 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45mfv\" (UniqueName: \"kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv\") pod \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") "
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.596217 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data\") pod \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") "
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.596262 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs\") pod \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") "
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.596348 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs\") pod \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") "
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.596395 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle\") pod \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\" (UID: \"438cb977-993f-4bd0-b3d4-a1b81ecbadff\") "
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.618506 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs" (OuterVolumeSpecName: "logs") pod "438cb977-993f-4bd0-b3d4-a1b81ecbadff" (UID: "438cb977-993f-4bd0-b3d4-a1b81ecbadff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.618700 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv" (OuterVolumeSpecName: "kube-api-access-45mfv") pod "438cb977-993f-4bd0-b3d4-a1b81ecbadff" (UID: "438cb977-993f-4bd0-b3d4-a1b81ecbadff"). InnerVolumeSpecName "kube-api-access-45mfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.635573 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data" (OuterVolumeSpecName: "config-data") pod "438cb977-993f-4bd0-b3d4-a1b81ecbadff" (UID: "438cb977-993f-4bd0-b3d4-a1b81ecbadff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.656152 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "438cb977-993f-4bd0-b3d4-a1b81ecbadff" (UID: "438cb977-993f-4bd0-b3d4-a1b81ecbadff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.708860 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.708889 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45mfv\" (UniqueName: \"kubernetes.io/projected/438cb977-993f-4bd0-b3d4-a1b81ecbadff-kube-api-access-45mfv\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.708902 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.708911 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/438cb977-993f-4bd0-b3d4-a1b81ecbadff-logs\") on node \"crc\" DevicePath \"\""
Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.725382 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "438cb977-993f-4bd0-b3d4-a1b81ecbadff" (UID: "438cb977-993f-4bd0-b3d4-a1b81ecbadff"). InnerVolumeSpecName "nova-metadata-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.811689 4936 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/438cb977-993f-4bd0-b3d4-a1b81ecbadff-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.958957 4936 generic.go:334] "Generic (PLEG): container finished" podID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerID="582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18" exitCode=0 Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.959011 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerDied","Data":"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18"} Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.959038 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"438cb977-993f-4bd0-b3d4-a1b81ecbadff","Type":"ContainerDied","Data":"533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef"} Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.959057 4936 scope.go:117] "RemoveContainer" containerID="582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18" Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.959160 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.968800 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78596c14-d0a4-444c-8096-962a9359418a","Type":"ContainerStarted","Data":"498c3eee17ca3d9661c1cbdc29286c9d476f8c195fb9d279de52e7f7a6945449"} Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.968838 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78596c14-d0a4-444c-8096-962a9359418a","Type":"ContainerStarted","Data":"4843ca1e0476b4a6683a5d0cbd24cf594438ee16b0b53a2fc16d93eb50c0e95a"} Sep 30 14:00:30 crc kubenswrapper[4936]: I0930 14:00:30.992557 4936 scope.go:117] "RemoveContainer" containerID="a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.005519 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.005496735 podStartE2EDuration="2.005496735s" podCreationTimestamp="2025-09-30 14:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:30.998053271 +0000 UTC m=+1281.382055572" watchObservedRunningTime="2025-09-30 14:00:31.005496735 +0000 UTC m=+1281.389499036" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.049174 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.060585 4936 scope.go:117] "RemoveContainer" containerID="582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18" Sep 30 14:00:31 crc kubenswrapper[4936]: E0930 14:00:31.065425 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18\": container with ID starting 
with 582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18 not found: ID does not exist" containerID="582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.065463 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18"} err="failed to get container status \"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18\": rpc error: code = NotFound desc = could not find container \"582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18\": container with ID starting with 582bfbfb80aa86b162217be00f0e4259b15c0f020f71ee10b638e3179e341b18 not found: ID does not exist" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.065488 4936 scope.go:117] "RemoveContainer" containerID="a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67" Sep 30 14:00:31 crc kubenswrapper[4936]: E0930 14:00:31.069156 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67\": container with ID starting with a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67 not found: ID does not exist" containerID="a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.069193 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67"} err="failed to get container status \"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67\": rpc error: code = NotFound desc = could not find container \"a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67\": container with ID starting with a8e546b6492c845d35fc4ff5a0ab574b0e618bf6a29013e0d9d7d707f008bf67 not found: ID does 
not exist" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.069649 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.080469 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:00:31 crc kubenswrapper[4936]: E0930 14:00:31.081081 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-log" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.081097 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-log" Sep 30 14:00:31 crc kubenswrapper[4936]: E0930 14:00:31.081111 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-metadata" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.081118 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-metadata" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.082753 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-log" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.082790 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" containerName="nova-metadata-metadata" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.083994 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.092132 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.092513 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.106605 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.126159 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:00:31 crc kubenswrapper[4936]: E0930 14:00:31.220007 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438cb977_993f_4bd0_b3d4_a1b81ecbadff.slice/crio-533015542cdc636a99c79aa08b0039e0a5a7e9fe8c2ebd8cb11864771e5567ef\": RecentStats: unable to find data in memory cache]" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.220644 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-logs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.220692 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.220775 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-config-data\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.220807 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbnj\" (UniqueName: \"kubernetes.io/projected/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-kube-api-access-zhbnj\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.220838 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.321936 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-config-data\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.322008 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbnj\" (UniqueName: \"kubernetes.io/projected/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-kube-api-access-zhbnj\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.322064 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.322113 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-logs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.322145 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.322720 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-logs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.325984 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.326303 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc 
kubenswrapper[4936]: I0930 14:00:31.326414 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-config-data\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.340293 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbnj\" (UniqueName: \"kubernetes.io/projected/f5e4f5cf-48f7-4f5a-a503-cd4d57174087-kube-api-access-zhbnj\") pod \"nova-metadata-0\" (UID: \"f5e4f5cf-48f7-4f5a-a503-cd4d57174087\") " pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.561881 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.980857 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44f3fb8e-7a0f-4e09-877d-eb823eac2b78","Type":"ContainerStarted","Data":"ef930b1c43d55b7bd26fafad8a4ce31eef5d9dedb468b2fd5511fb804f503988"} Sep 30 14:00:31 crc kubenswrapper[4936]: I0930 14:00:31.981314 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44f3fb8e-7a0f-4e09-877d-eb823eac2b78","Type":"ContainerStarted","Data":"e540806eecfb70f57030c522eac46ebd29d4a7ee98d256993b7190ee0f2011e9"} Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.008379 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.008355348 podStartE2EDuration="2.008355348s" podCreationTimestamp="2025-09-30 14:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:32.002821477 +0000 UTC m=+1282.386823778" watchObservedRunningTime="2025-09-30 
14:00:32.008355348 +0000 UTC m=+1282.392357659" Sep 30 14:00:32 crc kubenswrapper[4936]: W0930 14:00:32.039487 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4f5cf_48f7_4f5a_a503_cd4d57174087.slice/crio-2cb8c3d6a5b5b77db605d6a24725cffda465dbb32a9865cf336e23ea1007627a WatchSource:0}: Error finding container 2cb8c3d6a5b5b77db605d6a24725cffda465dbb32a9865cf336e23ea1007627a: Status 404 returned error can't find the container with id 2cb8c3d6a5b5b77db605d6a24725cffda465dbb32a9865cf336e23ea1007627a Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.055506 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.329453 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438cb977-993f-4bd0-b3d4-a1b81ecbadff" path="/var/lib/kubelet/pods/438cb977-993f-4bd0-b3d4-a1b81ecbadff/volumes" Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.993704 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5e4f5cf-48f7-4f5a-a503-cd4d57174087","Type":"ContainerStarted","Data":"2e5b0cc473e4b7dc29c62be71e25aee1d27642fedf2ab63eb7e85b14ef2d1a39"} Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.994068 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5e4f5cf-48f7-4f5a-a503-cd4d57174087","Type":"ContainerStarted","Data":"a96973ea4d2e43fadc16e935809718736895335f2148eb15f5609e83ac2bd134"} Sep 30 14:00:32 crc kubenswrapper[4936]: I0930 14:00:32.994084 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5e4f5cf-48f7-4f5a-a503-cd4d57174087","Type":"ContainerStarted","Data":"2cb8c3d6a5b5b77db605d6a24725cffda465dbb32a9865cf336e23ea1007627a"} Sep 30 14:00:35 crc kubenswrapper[4936]: I0930 14:00:35.472782 4936 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 14:00:36 crc kubenswrapper[4936]: I0930 14:00:36.562809 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:00:36 crc kubenswrapper[4936]: I0930 14:00:36.562881 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:00:39 crc kubenswrapper[4936]: I0930 14:00:39.394810 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:00:39 crc kubenswrapper[4936]: I0930 14:00:39.395216 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:00:40 crc kubenswrapper[4936]: I0930 14:00:40.408571 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78596c14-d0a4-444c-8096-962a9359418a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:00:40 crc kubenswrapper[4936]: I0930 14:00:40.408860 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78596c14-d0a4-444c-8096-962a9359418a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:00:40 crc kubenswrapper[4936]: I0930 14:00:40.472681 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 14:00:40 crc kubenswrapper[4936]: I0930 14:00:40.502215 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 14:00:40 crc kubenswrapper[4936]: I0930 14:00:40.532794 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.532768619 
podStartE2EDuration="9.532768619s" podCreationTimestamp="2025-09-30 14:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:33.02292295 +0000 UTC m=+1283.406925261" watchObservedRunningTime="2025-09-30 14:00:40.532768619 +0000 UTC m=+1290.916770920" Sep 30 14:00:41 crc kubenswrapper[4936]: I0930 14:00:41.107920 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 14:00:41 crc kubenswrapper[4936]: I0930 14:00:41.562665 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:00:41 crc kubenswrapper[4936]: I0930 14:00:41.562738 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:00:42 crc kubenswrapper[4936]: I0930 14:00:42.574551 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f5e4f5cf-48f7-4f5a-a503-cd4d57174087" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:00:42 crc kubenswrapper[4936]: I0930 14:00:42.574556 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f5e4f5cf-48f7-4f5a-a503-cd4d57174087" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:00:45 crc kubenswrapper[4936]: I0930 14:00:45.053216 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 14:00:48 crc kubenswrapper[4936]: I0930 14:00:48.249648 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:00:48 crc kubenswrapper[4936]: I0930 14:00:48.250508 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.403653 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.404035 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.405229 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.405270 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.412417 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:00:49 crc kubenswrapper[4936]: I0930 14:00:49.417756 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:00:51 crc kubenswrapper[4936]: I0930 14:00:51.567151 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:00:51 crc kubenswrapper[4936]: I0930 14:00:51.569281 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:00:51 crc kubenswrapper[4936]: I0930 14:00:51.572638 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Sep 30 14:00:52 crc kubenswrapper[4936]: I0930 14:00:52.167536 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.150542 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320681-l9b2f"] Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.156435 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.168492 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320681-l9b2f"] Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.248020 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.248469 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.248715 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.248825 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p84c5\" (UniqueName: \"kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.350675 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.350796 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.350828 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p84c5\" (UniqueName: \"kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.350857 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f" Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.358536 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.363451 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.366051 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.373454 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p84c5\" (UniqueName: \"kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5\") pod \"keystone-cron-29320681-l9b2f\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") " pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.501957 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:00 crc kubenswrapper[4936]: I0930 14:01:00.516256 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 14:01:01 crc kubenswrapper[4936]: I0930 14:01:01.103059 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320681-l9b2f"]
Sep 30 14:01:01 crc kubenswrapper[4936]: I0930 14:01:01.235259 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-l9b2f" event={"ID":"9476f1f4-61fa-4d56-a54b-cf28db2e0d47","Type":"ContainerStarted","Data":"32152a7cf235e89035d1239b3bc64f8cade59072786e9016ca6aaeb1930e8995"}
Sep 30 14:01:01 crc kubenswrapper[4936]: I0930 14:01:01.442558 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 14:01:02 crc kubenswrapper[4936]: I0930 14:01:02.262840 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-l9b2f" event={"ID":"9476f1f4-61fa-4d56-a54b-cf28db2e0d47","Type":"ContainerStarted","Data":"f456227807505f64ae09237b2da558ebf09fdd590d27078376acb8fa18f518ca"}
Sep 30 14:01:02 crc kubenswrapper[4936]: I0930 14:01:02.294734 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320681-l9b2f" podStartSLOduration=2.294711192 podStartE2EDuration="2.294711192s" podCreationTimestamp="2025-09-30 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:02.287389732 +0000 UTC m=+1312.671392033" watchObservedRunningTime="2025-09-30 14:01:02.294711192 +0000 UTC m=+1312.678713493"
Sep 30 14:01:05 crc kubenswrapper[4936]: I0930 14:01:05.294749 4936 generic.go:334] "Generic (PLEG): container finished" podID="9476f1f4-61fa-4d56-a54b-cf28db2e0d47" containerID="f456227807505f64ae09237b2da558ebf09fdd590d27078376acb8fa18f518ca" exitCode=0
Sep 30 14:01:05 crc kubenswrapper[4936]: I0930 14:01:05.294832 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-l9b2f" event={"ID":"9476f1f4-61fa-4d56-a54b-cf28db2e0d47","Type":"ContainerDied","Data":"f456227807505f64ae09237b2da558ebf09fdd590d27078376acb8fa18f518ca"}
Sep 30 14:01:05 crc kubenswrapper[4936]: I0930 14:01:05.919584 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq" containerID="cri-o://7fa365f8532602152ab7fa0f35c0ed5f4cb79fa1da22fba6027e4d281675a4f2" gracePeriod=604795
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.660822 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.690983 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq" containerID="cri-o://ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0" gracePeriod=604795
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.764856 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p84c5\" (UniqueName: \"kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5\") pod \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") "
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.764949 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys\") pod \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") "
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.765021 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data\") pod \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") "
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.765144 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle\") pod \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\" (UID: \"9476f1f4-61fa-4d56-a54b-cf28db2e0d47\") "
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.783748 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9476f1f4-61fa-4d56-a54b-cf28db2e0d47" (UID: "9476f1f4-61fa-4d56-a54b-cf28db2e0d47"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.784898 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5" (OuterVolumeSpecName: "kube-api-access-p84c5") pod "9476f1f4-61fa-4d56-a54b-cf28db2e0d47" (UID: "9476f1f4-61fa-4d56-a54b-cf28db2e0d47"). InnerVolumeSpecName "kube-api-access-p84c5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.806092 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9476f1f4-61fa-4d56-a54b-cf28db2e0d47" (UID: "9476f1f4-61fa-4d56-a54b-cf28db2e0d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.820812 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data" (OuterVolumeSpecName: "config-data") pod "9476f1f4-61fa-4d56-a54b-cf28db2e0d47" (UID: "9476f1f4-61fa-4d56-a54b-cf28db2e0d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.867621 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p84c5\" (UniqueName: \"kubernetes.io/projected/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-kube-api-access-p84c5\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.867667 4936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.867682 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:06 crc kubenswrapper[4936]: I0930 14:01:06.867698 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9476f1f4-61fa-4d56-a54b-cf28db2e0d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:07 crc kubenswrapper[4936]: I0930 14:01:07.313850 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-l9b2f" event={"ID":"9476f1f4-61fa-4d56-a54b-cf28db2e0d47","Type":"ContainerDied","Data":"32152a7cf235e89035d1239b3bc64f8cade59072786e9016ca6aaeb1930e8995"}
Sep 30 14:01:07 crc kubenswrapper[4936]: I0930 14:01:07.314198 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32152a7cf235e89035d1239b3bc64f8cade59072786e9016ca6aaeb1930e8995"
Sep 30 14:01:07 crc kubenswrapper[4936]: I0930 14:01:07.314378 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-l9b2f"
Sep 30 14:01:09 crc kubenswrapper[4936]: I0930 14:01:09.620387 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Sep 30 14:01:09 crc kubenswrapper[4936]: I0930 14:01:09.976771 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused"
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.355721 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerID="7fa365f8532602152ab7fa0f35c0ed5f4cb79fa1da22fba6027e4d281675a4f2" exitCode=0
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.356413 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerDied","Data":"7fa365f8532602152ab7fa0f35c0ed5f4cb79fa1da22fba6027e4d281675a4f2"}
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.446196 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.565252 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.565405 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566132 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566176 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566258 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566362 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566388 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566498 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566522 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rndm8\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.566636 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret\") pod \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\" (UID: \"bf1fd592-e9a1-4f76-af38-961560e7b6f4\") "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.569064 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.573114 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.580193 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.583299 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.589559 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.604224 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8" (OuterVolumeSpecName: "kube-api-access-rndm8") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "kube-api-access-rndm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.604306 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info" (OuterVolumeSpecName: "pod-info") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.611628 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.631368 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data" (OuterVolumeSpecName: "config-data") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.657203 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf" (OuterVolumeSpecName: "server-conf") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669131 4936 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf1fd592-e9a1-4f76-af38-961560e7b6f4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669173 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669209 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669221 4936 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-server-conf\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669231 4936 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf1fd592-e9a1-4f76-af38-961560e7b6f4-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669240 4936 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf1fd592-e9a1-4f76-af38-961560e7b6f4-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669250 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669261 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669273 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.669283 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rndm8\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-kube-api-access-rndm8\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.707499 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.762224 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bf1fd592-e9a1-4f76-af38-961560e7b6f4" (UID: "bf1fd592-e9a1-4f76-af38-961560e7b6f4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.770630 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf1fd592-e9a1-4f76-af38-961560e7b6f4-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:12 crc kubenswrapper[4936]: I0930 14:01:12.770661 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.252912 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.370449 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf1fd592-e9a1-4f76-af38-961560e7b6f4","Type":"ContainerDied","Data":"dbc4eacb64c32abb45fe3aa87db4b771d654ca2ccc207750e6c69e96b637a188"}
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.370503 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.370524 4936 scope.go:117] "RemoveContainer" containerID="7fa365f8532602152ab7fa0f35c0ed5f4cb79fa1da22fba6027e4d281675a4f2"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.373546 4936 generic.go:334] "Generic (PLEG): container finished" podID="22002396-4cfa-4e41-95c0-61672072faa0" containerID="ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0" exitCode=0
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.373879 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerDied","Data":"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0"}
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.373909 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22002396-4cfa-4e41-95c0-61672072faa0","Type":"ContainerDied","Data":"14dd3d3e163f55173611e09f1605703763195171e7e3d2c42c90cfb6cbf7c5dc"}
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.373967 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388000 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388052 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388091 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388112 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388163 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8dr7\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388208 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388227 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388366 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388381 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388401 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.388438 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls\") pod \"22002396-4cfa-4e41-95c0-61672072faa0\" (UID: \"22002396-4cfa-4e41-95c0-61672072faa0\") "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.401970 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.407414 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.408796 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.410990 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.411814 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.412115 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info" (OuterVolumeSpecName: "pod-info") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.448511 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.454678 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7" (OuterVolumeSpecName: "kube-api-access-v8dr7") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "kube-api-access-v8dr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491451 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491533 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491545 4936 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22002396-4cfa-4e41-95c0-61672072faa0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491592 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491606 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491617 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8dr7\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-kube-api-access-v8dr7\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491626 4936 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22002396-4cfa-4e41-95c0-61672072faa0-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.491637 4936 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.492298 4936 scope.go:117] "RemoveContainer" containerID="7d23d36032bcd16c3f026a158ff8a7636fbe1c97e9216ccaf29dded344afc381"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.523237 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.571822 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data" (OuterVolumeSpecName: "config-data") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.583053 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.608069 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.608108 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.623497 4936 scope.go:117] "RemoveContainer" containerID="ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.652835 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674110 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.674524 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9476f1f4-61fa-4d56-a54b-cf28db2e0d47" containerName="keystone-cron"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674541 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9476f1f4-61fa-4d56-a54b-cf28db2e0d47" containerName="keystone-cron"
Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.674557 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="setup-container"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674564 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="setup-container"
Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.674580 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="setup-container"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674587 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="setup-container"
Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.674604 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674610 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.674627 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674632 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674806 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="22002396-4cfa-4e41-95c0-61672072faa0" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674821 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9476f1f4-61fa-4d56-a54b-cf28db2e0d47" containerName="keystone-cron"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.674832 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" containerName="rabbitmq"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.675880 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.686875 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.687116 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.687268 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.689737 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.690637 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6llk4"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.690808 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.690966 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.702553 4936 scope.go:117] "RemoveContainer" containerID="3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac"
Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.704817 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "rabbitmq-confd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.711527 4936 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22002396-4cfa-4e41-95c0-61672072faa0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.725671 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.742367 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf" (OuterVolumeSpecName: "server-conf") pod "22002396-4cfa-4e41-95c0-61672072faa0" (UID: "22002396-4cfa-4e41-95c0-61672072faa0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.766168 4936 scope.go:117] "RemoveContainer" containerID="ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0" Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.768145 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0\": container with ID starting with ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0 not found: ID does not exist" containerID="ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.768185 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0"} err="failed to get container status \"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0\": rpc error: code = NotFound desc = could not find container 
\"ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0\": container with ID starting with ca6808b9356155232c4ee79ce1b52d14482f1b8a9862c1518cb160909a72f3a0 not found: ID does not exist" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.768214 4936 scope.go:117] "RemoveContainer" containerID="3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac" Sep 30 14:01:13 crc kubenswrapper[4936]: E0930 14:01:13.769212 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac\": container with ID starting with 3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac not found: ID does not exist" containerID="3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.769279 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac"} err="failed to get container status \"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac\": rpc error: code = NotFound desc = could not find container \"3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac\": container with ID starting with 3d9e28cd2db0d2fe92085ea29d9df2b8dbd3e6ebb03e32c690054a5b2c16fdac not found: ID does not exist" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813302 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd17158a-07d3-477e-8aa6-d03c3cb277c8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813365 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813387 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813410 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813434 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vnlj\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-kube-api-access-4vnlj\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813470 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813488 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813538 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813594 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd17158a-07d3-477e-8aa6-d03c3cb277c8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813842 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-config-data\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.813913 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.814037 4936 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22002396-4cfa-4e41-95c0-61672072faa0-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:13 crc kubenswrapper[4936]: 
I0930 14:01:13.915773 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd17158a-07d3-477e-8aa6-d03c3cb277c8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.915868 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-config-data\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.915899 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.915948 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd17158a-07d3-477e-8aa6-d03c3cb277c8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.915972 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.916158 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.916525 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.916750 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.916993 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-config-data\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.917086 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.917660 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd17158a-07d3-477e-8aa6-d03c3cb277c8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 
14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.916190 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.917772 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vnlj\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-kube-api-access-4vnlj\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.917813 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.917831 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.918203 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.918393 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.919416 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd17158a-07d3-477e-8aa6-d03c3cb277c8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.919741 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd17158a-07d3-477e-8aa6-d03c3cb277c8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.922270 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.927679 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.936091 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vnlj\" (UniqueName: \"kubernetes.io/projected/fd17158a-07d3-477e-8aa6-d03c3cb277c8-kube-api-access-4vnlj\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " 
pod="openstack/rabbitmq-server-0" Sep 30 14:01:13 crc kubenswrapper[4936]: I0930 14:01:13.958654 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"fd17158a-07d3-477e-8aa6-d03c3cb277c8\") " pod="openstack/rabbitmq-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.012224 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.028895 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.044248 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.046053 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.049457 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.049468 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.050431 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.050644 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.051262 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.051543 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.051792 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w7g9k" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.052085 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.053005 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121064 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121151 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba18f440-0c9a-45d0-a1de-9f363bc654cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121191 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba18f440-0c9a-45d0-a1de-9f363bc654cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121216 
4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121247 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121348 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121387 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6bb\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-kube-api-access-rk6bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121428 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 
14:01:14.121481 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121505 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.121550 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223237 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223694 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba18f440-0c9a-45d0-a1de-9f363bc654cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223725 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba18f440-0c9a-45d0-a1de-9f363bc654cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223747 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223782 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223833 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223863 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk6bb\" (UniqueName: 
\"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-kube-api-access-rk6bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223889 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223913 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223930 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.224458 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.224515 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.223488 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.225632 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.229927 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ba18f440-0c9a-45d0-a1de-9f363bc654cf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.229991 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.230555 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc 
kubenswrapper[4936]: I0930 14:01:14.230767 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.231005 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ba18f440-0c9a-45d0-a1de-9f363bc654cf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.246287 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ba18f440-0c9a-45d0-a1de-9f363bc654cf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.277541 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk6bb\" (UniqueName: \"kubernetes.io/projected/ba18f440-0c9a-45d0-a1de-9f363bc654cf-kube-api-access-rk6bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.278750 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ba18f440-0c9a-45d0-a1de-9f363bc654cf\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: W0930 14:01:14.363063 4936 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd17158a_07d3_477e_8aa6_d03c3cb277c8.slice/crio-72e26a9c0d2b28ed3c39163a2eb4d89029b895752fa4dcd90182478cc491f4b7 WatchSource:0}: Error finding container 72e26a9c0d2b28ed3c39163a2eb4d89029b895752fa4dcd90182478cc491f4b7: Status 404 returned error can't find the container with id 72e26a9c0d2b28ed3c39163a2eb4d89029b895752fa4dcd90182478cc491f4b7 Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.369315 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22002396-4cfa-4e41-95c0-61672072faa0" path="/var/lib/kubelet/pods/22002396-4cfa-4e41-95c0-61672072faa0/volumes" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.373297 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1fd592-e9a1-4f76-af38-961560e7b6f4" path="/var/lib/kubelet/pods/bf1fd592-e9a1-4f76-af38-961560e7b6f4/volumes" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.374320 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.374636 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.384937 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fd17158a-07d3-477e-8aa6-d03c3cb277c8","Type":"ContainerStarted","Data":"72e26a9c0d2b28ed3c39163a2eb4d89029b895752fa4dcd90182478cc491f4b7"} Sep 30 14:01:14 crc kubenswrapper[4936]: I0930 14:01:14.873420 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:01:14 crc kubenswrapper[4936]: W0930 14:01:14.875344 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba18f440_0c9a_45d0_a1de_9f363bc654cf.slice/crio-0f2a1131bf291a6bd537b552ec05729b57301db1953f809d873da6694347f618 WatchSource:0}: Error finding container 0f2a1131bf291a6bd537b552ec05729b57301db1953f809d873da6694347f618: Status 404 returned error can't find the container with id 0f2a1131bf291a6bd537b552ec05729b57301db1953f809d873da6694347f618 Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.395624 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fd17158a-07d3-477e-8aa6-d03c3cb277c8","Type":"ContainerStarted","Data":"06d5249811691e1104dc2b1046431c9a31bdc5b8456e117afbe9c633d61a4fac"} Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.397037 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ba18f440-0c9a-45d0-a1de-9f363bc654cf","Type":"ContainerStarted","Data":"ed2d21a1916fab3730fbafb260f62f625d75eed589a6210b29bcf5dc5e17f2a0"} Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.397088 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ba18f440-0c9a-45d0-a1de-9f363bc654cf","Type":"ContainerStarted","Data":"0f2a1131bf291a6bd537b552ec05729b57301db1953f809d873da6694347f618"} Sep 30 
14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.975769 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.977808 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.983056 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 14:01:15 crc kubenswrapper[4936]: I0930 14:01:15.988877 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067554 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crb5p\" (UniqueName: \"kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067625 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067706 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067764 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067851 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.067948 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170045 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crb5p\" (UniqueName: \"kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170107 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170154 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170181 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170239 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.170277 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.171121 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.171134 4936 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.171168 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.171298 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.171793 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.188461 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crb5p\" (UniqueName: \"kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p\") pod \"dnsmasq-dns-6447ccbd8f-dlshd\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.315453 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:16 crc kubenswrapper[4936]: I0930 14:01:16.775914 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:16 crc kubenswrapper[4936]: W0930 14:01:16.805253 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a6a678_07bb_4a42_bc3a_da653fe9b529.slice/crio-94bcf218a40fa5ce2595e4eced2eee6f426067930de6491c60eb80a01103b3cd WatchSource:0}: Error finding container 94bcf218a40fa5ce2595e4eced2eee6f426067930de6491c60eb80a01103b3cd: Status 404 returned error can't find the container with id 94bcf218a40fa5ce2595e4eced2eee6f426067930de6491c60eb80a01103b3cd Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.125367 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w"] Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.126852 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.132146 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.132710 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.133165 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.133481 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.141075 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w"] Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.212007 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7dn\" (UniqueName: \"kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.212250 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.212313 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.212498 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.313887 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.313971 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7dn\" (UniqueName: \"kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.314090 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.314117 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.318823 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.319009 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.328223 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: 
I0930 14:01:17.331280 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7dn\" (UniqueName: \"kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.422111 4936 generic.go:334] "Generic (PLEG): container finished" podID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerID="6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89" exitCode=0 Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.422157 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" event={"ID":"44a6a678-07bb-4a42-bc3a-da653fe9b529","Type":"ContainerDied","Data":"6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89"} Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.422184 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" event={"ID":"44a6a678-07bb-4a42-bc3a-da653fe9b529","Type":"ContainerStarted","Data":"94bcf218a40fa5ce2595e4eced2eee6f426067930de6491c60eb80a01103b3cd"} Sep 30 14:01:17 crc kubenswrapper[4936]: I0930 14:01:17.467917 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.025692 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w"] Sep 30 14:01:18 crc kubenswrapper[4936]: W0930 14:01:18.031416 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c83542_c990_48c0_8113_17bd96de3cc0.slice/crio-67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63 WatchSource:0}: Error finding container 67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63: Status 404 returned error can't find the container with id 67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63 Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.249852 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.250214 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.430593 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" event={"ID":"98c83542-c990-48c0-8113-17bd96de3cc0","Type":"ContainerStarted","Data":"67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63"} Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.438828 4936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" event={"ID":"44a6a678-07bb-4a42-bc3a-da653fe9b529","Type":"ContainerStarted","Data":"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4"} Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.439136 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:18 crc kubenswrapper[4936]: I0930 14:01:18.460181 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" podStartSLOduration=3.460164047 podStartE2EDuration="3.460164047s" podCreationTimestamp="2025-09-30 14:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:18.458206204 +0000 UTC m=+1328.842208515" watchObservedRunningTime="2025-09-30 14:01:18.460164047 +0000 UTC m=+1328.844166348" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.327002 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.393019 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.393298 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="dnsmasq-dns" containerID="cri-o://a5d810c6d3904389881d6c3c64e729fa908389da3275cb0713b12784c0504c75" gracePeriod=10 Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.577424 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.579050 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.617606 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.708838 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.708910 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.708935 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.708964 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.709018 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4ths7\" (UniqueName: \"kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.709135 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811115 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811180 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811206 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811226 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811256 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.811304 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ths7\" (UniqueName: \"kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.812735 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.813057 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.813534 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc\") pod 
\"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.813907 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.814594 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.830462 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ths7\" (UniqueName: \"kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7\") pod \"dnsmasq-dns-fb68d687f-6n4nt\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:26 crc kubenswrapper[4936]: I0930 14:01:26.904886 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:27 crc kubenswrapper[4936]: I0930 14:01:27.534879 4936 generic.go:334] "Generic (PLEG): container finished" podID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerID="a5d810c6d3904389881d6c3c64e729fa908389da3275cb0713b12784c0504c75" exitCode=0 Sep 30 14:01:27 crc kubenswrapper[4936]: I0930 14:01:27.534917 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" event={"ID":"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e","Type":"ContainerDied","Data":"a5d810c6d3904389881d6c3c64e729fa908389da3275cb0713b12784c0504c75"} Sep 30 14:01:29 crc kubenswrapper[4936]: I0930 14:01:29.201983 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.621243 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.725037 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb\") pod \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.725215 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb\") pod \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.725248 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc\") pod \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.725268 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q9ph\" (UniqueName: \"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph\") pod \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.725305 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config\") pod \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\" (UID: \"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e\") " Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.764582 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph" (OuterVolumeSpecName: "kube-api-access-2q9ph") pod "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" (UID: "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e"). InnerVolumeSpecName "kube-api-access-2q9ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.827500 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q9ph\" (UniqueName: \"kubernetes.io/projected/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-kube-api-access-2q9ph\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.902440 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" (UID: "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.910191 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" (UID: "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.912893 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config" (OuterVolumeSpecName: "config") pod "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" (UID: "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.920997 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" (UID: "9963394e-63e7-402f-8c8c-0a4a9b1f0e9e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.929910 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.930098 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.930170 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:30 crc kubenswrapper[4936]: I0930 14:01:30.930228 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.024723 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:01:31 crc kubenswrapper[4936]: W0930 14:01:31.034688 4936 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f86ab2e_56dc_4b7a_a3a3_b69b0922932e.slice/crio-de759f2acafd06490f3a8733642e87af6b60436f3d369a0ec0deefa1ac064979 WatchSource:0}: Error finding container de759f2acafd06490f3a8733642e87af6b60436f3d369a0ec0deefa1ac064979: Status 404 returned error can't find the container with id de759f2acafd06490f3a8733642e87af6b60436f3d369a0ec0deefa1ac064979 Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.589369 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" event={"ID":"9963394e-63e7-402f-8c8c-0a4a9b1f0e9e","Type":"ContainerDied","Data":"951d5d505ffee6bd0df0e0acbf82980c51b354aaf87bb8912307d3f764531af6"} Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.589709 4936 scope.go:117] "RemoveContainer" containerID="a5d810c6d3904389881d6c3c64e729fa908389da3275cb0713b12784c0504c75" Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.589662 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-jkmjc" Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.593058 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" event={"ID":"98c83542-c990-48c0-8113-17bd96de3cc0","Type":"ContainerStarted","Data":"9fe64979b53c159f6a00c90b57a10131a14059975709a0e6c31ca9b79c717e0b"} Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.599038 4936 generic.go:334] "Generic (PLEG): container finished" podID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerID="3e061a13e9108cd23c7d0880477eb2b818d01b7f2d24f488329385a81bef2a7d" exitCode=0 Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.599077 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" event={"ID":"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e","Type":"ContainerDied","Data":"3e061a13e9108cd23c7d0880477eb2b818d01b7f2d24f488329385a81bef2a7d"} Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.599103 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" event={"ID":"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e","Type":"ContainerStarted","Data":"de759f2acafd06490f3a8733642e87af6b60436f3d369a0ec0deefa1ac064979"} Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.619542 4936 scope.go:117] "RemoveContainer" containerID="2617c80c17839a861ff2fb3497bbc7deb07b8abf88ed30b2c5cbfca6788ca1be" Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.636347 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" podStartSLOduration=2.2796662420000002 podStartE2EDuration="14.636311731s" podCreationTimestamp="2025-09-30 14:01:17 +0000 UTC" firstStartedPulling="2025-09-30 14:01:18.033497883 +0000 UTC m=+1328.417500204" lastFinishedPulling="2025-09-30 14:01:30.390143392 +0000 UTC m=+1340.774145693" observedRunningTime="2025-09-30 
14:01:31.620325124 +0000 UTC m=+1342.004327425" watchObservedRunningTime="2025-09-30 14:01:31.636311731 +0000 UTC m=+1342.020314032" Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.774635 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:01:31 crc kubenswrapper[4936]: I0930 14:01:31.784984 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-jkmjc"] Sep 30 14:01:32 crc kubenswrapper[4936]: I0930 14:01:32.324818 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" path="/var/lib/kubelet/pods/9963394e-63e7-402f-8c8c-0a4a9b1f0e9e/volumes" Sep 30 14:01:32 crc kubenswrapper[4936]: I0930 14:01:32.610726 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" event={"ID":"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e","Type":"ContainerStarted","Data":"3c4879fc349049e199522214d8a42998ffe40ef53b5b6beacbf32c5d51253673"} Sep 30 14:01:32 crc kubenswrapper[4936]: I0930 14:01:32.611019 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:32 crc kubenswrapper[4936]: I0930 14:01:32.638000 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" podStartSLOduration=6.637975421 podStartE2EDuration="6.637975421s" podCreationTimestamp="2025-09-30 14:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:32.631904365 +0000 UTC m=+1343.015906666" watchObservedRunningTime="2025-09-30 14:01:32.637975421 +0000 UTC m=+1343.021977722" Sep 30 14:01:36 crc kubenswrapper[4936]: I0930 14:01:36.907524 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:01:36 crc kubenswrapper[4936]: I0930 
14:01:36.966548 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:36 crc kubenswrapper[4936]: I0930 14:01:36.966773 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="dnsmasq-dns" containerID="cri-o://ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4" gracePeriod=10 Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.553924 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649553 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crb5p\" (UniqueName: \"kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649672 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649702 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649806 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649908 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.649993 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc\") pod \"44a6a678-07bb-4a42-bc3a-da653fe9b529\" (UID: \"44a6a678-07bb-4a42-bc3a-da653fe9b529\") " Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.652358 4936 generic.go:334] "Generic (PLEG): container finished" podID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerID="ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4" exitCode=0 Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.652411 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" event={"ID":"44a6a678-07bb-4a42-bc3a-da653fe9b529","Type":"ContainerDied","Data":"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4"} Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.652473 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" event={"ID":"44a6a678-07bb-4a42-bc3a-da653fe9b529","Type":"ContainerDied","Data":"94bcf218a40fa5ce2595e4eced2eee6f426067930de6491c60eb80a01103b3cd"} Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.652527 4936 scope.go:117] "RemoveContainer" containerID="ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.652585 4936 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-dlshd" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.660877 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p" (OuterVolumeSpecName: "kube-api-access-crb5p") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "kube-api-access-crb5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.706982 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.741963 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.749374 4936 scope.go:117] "RemoveContainer" containerID="6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.755169 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.755200 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crb5p\" (UniqueName: \"kubernetes.io/projected/44a6a678-07bb-4a42-bc3a-da653fe9b529-kube-api-access-crb5p\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.755212 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.764352 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.782252 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.782922 4936 scope.go:117] "RemoveContainer" containerID="ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.785932 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config" (OuterVolumeSpecName: "config") pod "44a6a678-07bb-4a42-bc3a-da653fe9b529" (UID: "44a6a678-07bb-4a42-bc3a-da653fe9b529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:01:37 crc kubenswrapper[4936]: E0930 14:01:37.786254 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4\": container with ID starting with ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4 not found: ID does not exist" containerID="ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.786304 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4"} err="failed to get container status \"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4\": rpc error: code = NotFound desc = could not find container \"ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4\": container with ID starting with ba1d09e2840f436d297d949dd8a774e31d8df42d079d1da6d194099adfad27c4 not found: ID does not exist" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.786355 4936 scope.go:117] "RemoveContainer" containerID="6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89" Sep 30 14:01:37 crc kubenswrapper[4936]: E0930 14:01:37.788089 4936 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89\": container with ID starting with 6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89 not found: ID does not exist" containerID="6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.788122 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89"} err="failed to get container status \"6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89\": rpc error: code = NotFound desc = could not find container \"6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89\": container with ID starting with 6b740ed19831e19bce12edd26193e1c11ee7e56755429a0d55031db472c56a89 not found: ID does not exist" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.856781 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.856821 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.856838 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a6a678-07bb-4a42-bc3a-da653fe9b529-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.984185 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:37 crc kubenswrapper[4936]: I0930 14:01:37.991169 4936 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-dlshd"] Sep 30 14:01:38 crc kubenswrapper[4936]: I0930 14:01:38.332866 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" path="/var/lib/kubelet/pods/44a6a678-07bb-4a42-bc3a-da653fe9b529/volumes" Sep 30 14:01:42 crc kubenswrapper[4936]: E0930 14:01:42.930304 4936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c83542_c990_48c0_8113_17bd96de3cc0.slice/crio-9fe64979b53c159f6a00c90b57a10131a14059975709a0e6c31ca9b79c717e0b.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:01:43 crc kubenswrapper[4936]: I0930 14:01:43.698575 4936 generic.go:334] "Generic (PLEG): container finished" podID="98c83542-c990-48c0-8113-17bd96de3cc0" containerID="9fe64979b53c159f6a00c90b57a10131a14059975709a0e6c31ca9b79c717e0b" exitCode=0 Sep 30 14:01:43 crc kubenswrapper[4936]: I0930 14:01:43.698637 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" event={"ID":"98c83542-c990-48c0-8113-17bd96de3cc0","Type":"ContainerDied","Data":"9fe64979b53c159f6a00c90b57a10131a14059975709a0e6c31ca9b79c717e0b"} Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.123518 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.274664 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key\") pod \"98c83542-c990-48c0-8113-17bd96de3cc0\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.274787 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc7dn\" (UniqueName: \"kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn\") pod \"98c83542-c990-48c0-8113-17bd96de3cc0\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.274879 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory\") pod \"98c83542-c990-48c0-8113-17bd96de3cc0\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.274921 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle\") pod \"98c83542-c990-48c0-8113-17bd96de3cc0\" (UID: \"98c83542-c990-48c0-8113-17bd96de3cc0\") " Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.280110 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98c83542-c990-48c0-8113-17bd96de3cc0" (UID: "98c83542-c990-48c0-8113-17bd96de3cc0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.289683 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn" (OuterVolumeSpecName: "kube-api-access-dc7dn") pod "98c83542-c990-48c0-8113-17bd96de3cc0" (UID: "98c83542-c990-48c0-8113-17bd96de3cc0"). InnerVolumeSpecName "kube-api-access-dc7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.303258 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory" (OuterVolumeSpecName: "inventory") pod "98c83542-c990-48c0-8113-17bd96de3cc0" (UID: "98c83542-c990-48c0-8113-17bd96de3cc0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.303754 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98c83542-c990-48c0-8113-17bd96de3cc0" (UID: "98c83542-c990-48c0-8113-17bd96de3cc0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.377050 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc7dn\" (UniqueName: \"kubernetes.io/projected/98c83542-c990-48c0-8113-17bd96de3cc0-kube-api-access-dc7dn\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.377077 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.377088 4936 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.377100 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98c83542-c990-48c0-8113-17bd96de3cc0-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.716799 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba18f440-0c9a-45d0-a1de-9f363bc654cf" containerID="ed2d21a1916fab3730fbafb260f62f625d75eed589a6210b29bcf5dc5e17f2a0" exitCode=0 Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.716865 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ba18f440-0c9a-45d0-a1de-9f363bc654cf","Type":"ContainerDied","Data":"ed2d21a1916fab3730fbafb260f62f625d75eed589a6210b29bcf5dc5e17f2a0"} Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.724454 4936 generic.go:334] "Generic (PLEG): container finished" podID="fd17158a-07d3-477e-8aa6-d03c3cb277c8" containerID="06d5249811691e1104dc2b1046431c9a31bdc5b8456e117afbe9c633d61a4fac" exitCode=0 Sep 30 14:01:45 crc kubenswrapper[4936]: 
I0930 14:01:45.724526 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fd17158a-07d3-477e-8aa6-d03c3cb277c8","Type":"ContainerDied","Data":"06d5249811691e1104dc2b1046431c9a31bdc5b8456e117afbe9c633d61a4fac"} Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.731776 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" event={"ID":"98c83542-c990-48c0-8113-17bd96de3cc0","Type":"ContainerDied","Data":"67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63"} Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.731822 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67732c3e1660aae64905e052214ea890d1a3a080302db7fb8073ff430ff2cb63" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.731884 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.896407 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm"] Sep 30 14:01:45 crc kubenswrapper[4936]: E0930 14:01:45.897192 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897209 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: E0930 14:01:45.897230 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c83542-c990-48c0-8113-17bd96de3cc0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897242 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c83542-c990-48c0-8113-17bd96de3cc0" 
containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:01:45 crc kubenswrapper[4936]: E0930 14:01:45.897255 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="init" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897264 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="init" Sep 30 14:01:45 crc kubenswrapper[4936]: E0930 14:01:45.897281 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897290 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: E0930 14:01:45.897330 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="init" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897359 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="init" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897567 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a6a678-07bb-4a42-bc3a-da653fe9b529" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897594 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9963394e-63e7-402f-8c8c-0a4a9b1f0e9e" containerName="dnsmasq-dns" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.897607 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c83542-c990-48c0-8113-17bd96de3cc0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.898536 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.902527 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.902635 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.902730 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.904052 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm"] Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.905169 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.989146 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.989208 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.989241 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:45 crc kubenswrapper[4936]: I0930 14:01:45.989302 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qn7\" (UniqueName: \"kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.091121 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9qn7\" (UniqueName: \"kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.091270 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.091303 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.091349 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.095778 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.096025 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.097952 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.112094 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9qn7\" (UniqueName: \"kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.232997 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.742377 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ba18f440-0c9a-45d0-a1de-9f363bc654cf","Type":"ContainerStarted","Data":"09c463e817e6e0f79c31977c05600e4a43c4626f9deac1168ab0e2b1aa6caebb"} Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.743121 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.744818 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fd17158a-07d3-477e-8aa6-d03c3cb277c8","Type":"ContainerStarted","Data":"37a9ba6b9bf1aebb09faff84c3bb8a2847c5f59b035c9df0458b83f2ce61d803"} Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.745068 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.806287 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=32.806266074 podStartE2EDuration="32.806266074s" podCreationTimestamp="2025-09-30 14:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:46.768482772 +0000 UTC 
m=+1357.152485083" watchObservedRunningTime="2025-09-30 14:01:46.806266074 +0000 UTC m=+1357.190268375" Sep 30 14:01:46 crc kubenswrapper[4936]: W0930 14:01:46.814370 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aaaa50a_6929_4654_b42b_ccfcd712d106.slice/crio-0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c WatchSource:0}: Error finding container 0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c: Status 404 returned error can't find the container with id 0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.819034 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm"] Sep 30 14:01:46 crc kubenswrapper[4936]: I0930 14:01:46.820604 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.820583085 podStartE2EDuration="33.820583085s" podCreationTimestamp="2025-09-30 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:46.805724699 +0000 UTC m=+1357.189727000" watchObservedRunningTime="2025-09-30 14:01:46.820583085 +0000 UTC m=+1357.204585416" Sep 30 14:01:47 crc kubenswrapper[4936]: I0930 14:01:47.753608 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" event={"ID":"0aaaa50a-6929-4654-b42b-ccfcd712d106","Type":"ContainerStarted","Data":"0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c"} Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.250326 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.250406 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.250455 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.251159 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.251216 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628" gracePeriod=600 Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.763537 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628" exitCode=0 Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.763611 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628"} Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.764071 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707"} Sep 30 14:01:48 crc kubenswrapper[4936]: I0930 14:01:48.764090 4936 scope.go:117] "RemoveContainer" containerID="17d93f7a347eff1cb03a59a1226bb2a542917483154320d58d4c72a501cddc95" Sep 30 14:01:49 crc kubenswrapper[4936]: I0930 14:01:49.793011 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" event={"ID":"0aaaa50a-6929-4654-b42b-ccfcd712d106","Type":"ContainerStarted","Data":"0dff08732339e1b583d0b75d3601c0b458cb041aa1b0d04690b3d23b4928983c"} Sep 30 14:01:49 crc kubenswrapper[4936]: I0930 14:01:49.813032 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" podStartSLOduration=2.737916032 podStartE2EDuration="4.813013532s" podCreationTimestamp="2025-09-30 14:01:45 +0000 UTC" firstStartedPulling="2025-09-30 14:01:46.817392738 +0000 UTC m=+1357.201395039" lastFinishedPulling="2025-09-30 14:01:48.892490238 +0000 UTC m=+1359.276492539" observedRunningTime="2025-09-30 14:01:49.812365304 +0000 UTC m=+1360.196367605" watchObservedRunningTime="2025-09-30 14:01:49.813013532 +0000 UTC m=+1360.197015833" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.271520 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.273744 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.303446 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.448388 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.448771 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbr2\" (UniqueName: \"kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.448897 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.550971 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dbr2\" (UniqueName: \"kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.551041 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.551094 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.551531 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.551634 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.585293 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dbr2\" (UniqueName: \"kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2\") pod \"community-operators-jbbw7\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:53 crc kubenswrapper[4936]: I0930 14:01:53.593027 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:01:54 crc kubenswrapper[4936]: I0930 14:01:54.162166 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:01:54 crc kubenswrapper[4936]: I0930 14:01:54.837646 4936 generic.go:334] "Generic (PLEG): container finished" podID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerID="778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5" exitCode=0 Sep 30 14:01:54 crc kubenswrapper[4936]: I0930 14:01:54.837693 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerDied","Data":"778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5"} Sep 30 14:01:54 crc kubenswrapper[4936]: I0930 14:01:54.839087 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerStarted","Data":"c762e732a707d2806c945536fa362cddc943a8c80ebe010c69b465bafe8b921d"} Sep 30 14:01:56 crc kubenswrapper[4936]: I0930 14:01:56.858555 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerStarted","Data":"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19"} Sep 30 14:01:57 crc kubenswrapper[4936]: I0930 14:01:57.868887 4936 generic.go:334] "Generic (PLEG): container finished" podID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerID="016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19" exitCode=0 Sep 30 14:01:57 crc kubenswrapper[4936]: I0930 14:01:57.868926 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" 
event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerDied","Data":"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19"} Sep 30 14:01:58 crc kubenswrapper[4936]: I0930 14:01:58.885953 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerStarted","Data":"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d"} Sep 30 14:01:58 crc kubenswrapper[4936]: I0930 14:01:58.909005 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jbbw7" podStartSLOduration=2.386185598 podStartE2EDuration="5.90898493s" podCreationTimestamp="2025-09-30 14:01:53 +0000 UTC" firstStartedPulling="2025-09-30 14:01:54.840883887 +0000 UTC m=+1365.224886188" lastFinishedPulling="2025-09-30 14:01:58.363683219 +0000 UTC m=+1368.747685520" observedRunningTime="2025-09-30 14:01:58.903406316 +0000 UTC m=+1369.287408617" watchObservedRunningTime="2025-09-30 14:01:58.90898493 +0000 UTC m=+1369.292987231" Sep 30 14:02:03 crc kubenswrapper[4936]: I0930 14:02:03.593643 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:03 crc kubenswrapper[4936]: I0930 14:02:03.594169 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:03 crc kubenswrapper[4936]: I0930 14:02:03.643124 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:03 crc kubenswrapper[4936]: I0930 14:02:03.970894 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:04 crc kubenswrapper[4936]: I0930 14:02:04.034625 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:02:04 crc kubenswrapper[4936]: I0930 14:02:04.057594 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 14:02:04 crc kubenswrapper[4936]: I0930 14:02:04.378509 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:02:05 crc kubenswrapper[4936]: I0930 14:02:05.939882 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jbbw7" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="registry-server" containerID="cri-o://6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d" gracePeriod=2 Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.400834 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.560414 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dbr2\" (UniqueName: \"kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2\") pod \"372dcfae-58dd-4bb6-a769-c8a45c579403\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.560486 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content\") pod \"372dcfae-58dd-4bb6-a769-c8a45c579403\" (UID: \"372dcfae-58dd-4bb6-a769-c8a45c579403\") " Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.560618 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities\") pod \"372dcfae-58dd-4bb6-a769-c8a45c579403\" (UID: 
\"372dcfae-58dd-4bb6-a769-c8a45c579403\") " Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.561552 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities" (OuterVolumeSpecName: "utilities") pod "372dcfae-58dd-4bb6-a769-c8a45c579403" (UID: "372dcfae-58dd-4bb6-a769-c8a45c579403"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.573767 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2" (OuterVolumeSpecName: "kube-api-access-9dbr2") pod "372dcfae-58dd-4bb6-a769-c8a45c579403" (UID: "372dcfae-58dd-4bb6-a769-c8a45c579403"). InnerVolumeSpecName "kube-api-access-9dbr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.618828 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "372dcfae-58dd-4bb6-a769-c8a45c579403" (UID: "372dcfae-58dd-4bb6-a769-c8a45c579403"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.662386 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.662616 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dbr2\" (UniqueName: \"kubernetes.io/projected/372dcfae-58dd-4bb6-a769-c8a45c579403-kube-api-access-9dbr2\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.662683 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372dcfae-58dd-4bb6-a769-c8a45c579403-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.951056 4936 generic.go:334] "Generic (PLEG): container finished" podID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerID="6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d" exitCode=0 Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.951130 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbbw7" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.951117 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerDied","Data":"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d"} Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.951267 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbbw7" event={"ID":"372dcfae-58dd-4bb6-a769-c8a45c579403","Type":"ContainerDied","Data":"c762e732a707d2806c945536fa362cddc943a8c80ebe010c69b465bafe8b921d"} Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.951291 4936 scope.go:117] "RemoveContainer" containerID="6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.970604 4936 scope.go:117] "RemoveContainer" containerID="016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19" Sep 30 14:02:06 crc kubenswrapper[4936]: I0930 14:02:06.996718 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.004071 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jbbw7"] Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.011139 4936 scope.go:117] "RemoveContainer" containerID="778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.046665 4936 scope.go:117] "RemoveContainer" containerID="6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d" Sep 30 14:02:07 crc kubenswrapper[4936]: E0930 14:02:07.047190 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d\": container with ID starting with 6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d not found: ID does not exist" containerID="6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.047317 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d"} err="failed to get container status \"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d\": rpc error: code = NotFound desc = could not find container \"6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d\": container with ID starting with 6f6d7a07162b56aef77d760e9092a75a11b923534a634225dc3908a4d31cba0d not found: ID does not exist" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.047422 4936 scope.go:117] "RemoveContainer" containerID="016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19" Sep 30 14:02:07 crc kubenswrapper[4936]: E0930 14:02:07.047723 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19\": container with ID starting with 016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19 not found: ID does not exist" containerID="016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.047751 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19"} err="failed to get container status \"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19\": rpc error: code = NotFound desc = could not find container \"016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19\": container with ID 
starting with 016631fd179f509e9b838d7ddfaa856a40f1a21abfd79d725b5e72ea9f1c2f19 not found: ID does not exist" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.047769 4936 scope.go:117] "RemoveContainer" containerID="778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5" Sep 30 14:02:07 crc kubenswrapper[4936]: E0930 14:02:07.048059 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5\": container with ID starting with 778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5 not found: ID does not exist" containerID="778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5" Sep 30 14:02:07 crc kubenswrapper[4936]: I0930 14:02:07.048175 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5"} err="failed to get container status \"778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5\": rpc error: code = NotFound desc = could not find container \"778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5\": container with ID starting with 778cde8effbe20c693a576f25a8d3b6bf176d781d0f9884f28c61d37241b7dd5 not found: ID does not exist" Sep 30 14:02:08 crc kubenswrapper[4936]: I0930 14:02:08.329151 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" path="/var/lib/kubelet/pods/372dcfae-58dd-4bb6-a769-c8a45c579403/volumes" Sep 30 14:02:35 crc kubenswrapper[4936]: I0930 14:02:35.930973 4936 scope.go:117] "RemoveContainer" containerID="fcdb45b1fbb5ca1ed97b759ac903c8b6056e829abc83644d5e679c61a0734034" Sep 30 14:02:35 crc kubenswrapper[4936]: I0930 14:02:35.963418 4936 scope.go:117] "RemoveContainer" containerID="add9150671b34395e2a0c04f67604c4a7430a4b6a780472138adada4476b15e1" Sep 30 14:02:42 crc kubenswrapper[4936]: 
I0930 14:02:42.271577 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:02:42 crc kubenswrapper[4936]: E0930 14:02:42.272486 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="extract-content" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.272499 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="extract-content" Sep 30 14:02:42 crc kubenswrapper[4936]: E0930 14:02:42.272516 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="extract-utilities" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.272523 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="extract-utilities" Sep 30 14:02:42 crc kubenswrapper[4936]: E0930 14:02:42.272537 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="registry-server" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.272543 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="registry-server" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.272700 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="372dcfae-58dd-4bb6-a769-c8a45c579403" containerName="registry-server" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.273938 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.290956 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.307841 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zst\" (UniqueName: \"kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.308122 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.308236 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.409156 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.409240 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.409284 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4zst\" (UniqueName: \"kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.410190 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.410519 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.430485 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4zst\" (UniqueName: \"kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst\") pod \"redhat-operators-mqbrq\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:42 crc kubenswrapper[4936]: I0930 14:02:42.598224 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:43 crc kubenswrapper[4936]: I0930 14:02:43.108657 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:02:43 crc kubenswrapper[4936]: I0930 14:02:43.249895 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerStarted","Data":"cf88048ea781e5d6e3959b0ec0b584b0191f99ab2b1567dada6c30258d0850af"} Sep 30 14:02:44 crc kubenswrapper[4936]: I0930 14:02:44.259117 4936 generic.go:334] "Generic (PLEG): container finished" podID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerID="457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0" exitCode=0 Sep 30 14:02:44 crc kubenswrapper[4936]: I0930 14:02:44.259478 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerDied","Data":"457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0"} Sep 30 14:02:46 crc kubenswrapper[4936]: I0930 14:02:46.278779 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerStarted","Data":"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7"} Sep 30 14:02:49 crc kubenswrapper[4936]: I0930 14:02:49.309788 4936 generic.go:334] "Generic (PLEG): container finished" podID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerID="c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7" exitCode=0 Sep 30 14:02:49 crc kubenswrapper[4936]: I0930 14:02:49.309880 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" 
event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerDied","Data":"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7"} Sep 30 14:02:50 crc kubenswrapper[4936]: I0930 14:02:50.344214 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerStarted","Data":"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549"} Sep 30 14:02:50 crc kubenswrapper[4936]: I0930 14:02:50.373924 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mqbrq" podStartSLOduration=2.849993306 podStartE2EDuration="8.373901345s" podCreationTimestamp="2025-09-30 14:02:42 +0000 UTC" firstStartedPulling="2025-09-30 14:02:44.262188832 +0000 UTC m=+1414.646191133" lastFinishedPulling="2025-09-30 14:02:49.786096871 +0000 UTC m=+1420.170099172" observedRunningTime="2025-09-30 14:02:50.36902516 +0000 UTC m=+1420.753027461" watchObservedRunningTime="2025-09-30 14:02:50.373901345 +0000 UTC m=+1420.757903656" Sep 30 14:02:52 crc kubenswrapper[4936]: I0930 14:02:52.598380 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:52 crc kubenswrapper[4936]: I0930 14:02:52.599762 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:02:53 crc kubenswrapper[4936]: I0930 14:02:53.668482 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqbrq" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="registry-server" probeResult="failure" output=< Sep 30 14:02:53 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:02:53 crc kubenswrapper[4936]: > Sep 30 14:03:03 crc kubenswrapper[4936]: I0930 14:03:03.639228 4936 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-mqbrq" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="registry-server" probeResult="failure" output=< Sep 30 14:03:03 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:03:03 crc kubenswrapper[4936]: > Sep 30 14:03:12 crc kubenswrapper[4936]: I0930 14:03:12.648893 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:03:12 crc kubenswrapper[4936]: I0930 14:03:12.694426 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:03:13 crc kubenswrapper[4936]: I0930 14:03:13.477735 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:03:14 crc kubenswrapper[4936]: I0930 14:03:14.553246 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mqbrq" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="registry-server" containerID="cri-o://712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549" gracePeriod=2 Sep 30 14:03:14 crc kubenswrapper[4936]: I0930 14:03:14.979854 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.129304 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4zst\" (UniqueName: \"kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst\") pod \"404b119c-ac63-4bd5-adfd-022409dd83b2\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.129721 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content\") pod \"404b119c-ac63-4bd5-adfd-022409dd83b2\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.129762 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities\") pod \"404b119c-ac63-4bd5-adfd-022409dd83b2\" (UID: \"404b119c-ac63-4bd5-adfd-022409dd83b2\") " Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.135633 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst" (OuterVolumeSpecName: "kube-api-access-s4zst") pod "404b119c-ac63-4bd5-adfd-022409dd83b2" (UID: "404b119c-ac63-4bd5-adfd-022409dd83b2"). InnerVolumeSpecName "kube-api-access-s4zst". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.136496 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities" (OuterVolumeSpecName: "utilities") pod "404b119c-ac63-4bd5-adfd-022409dd83b2" (UID: "404b119c-ac63-4bd5-adfd-022409dd83b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.220364 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "404b119c-ac63-4bd5-adfd-022409dd83b2" (UID: "404b119c-ac63-4bd5-adfd-022409dd83b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.233611 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.233651 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/404b119c-ac63-4bd5-adfd-022409dd83b2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.233664 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4zst\" (UniqueName: \"kubernetes.io/projected/404b119c-ac63-4bd5-adfd-022409dd83b2-kube-api-access-s4zst\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.564546 4936 generic.go:334] "Generic (PLEG): container finished" podID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerID="712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549" exitCode=0 Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.564587 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerDied","Data":"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549"} Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.564616 4936 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mqbrq" event={"ID":"404b119c-ac63-4bd5-adfd-022409dd83b2","Type":"ContainerDied","Data":"cf88048ea781e5d6e3959b0ec0b584b0191f99ab2b1567dada6c30258d0850af"} Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.564640 4936 scope.go:117] "RemoveContainer" containerID="712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.564711 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqbrq" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.592189 4936 scope.go:117] "RemoveContainer" containerID="c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.607029 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.616427 4936 scope.go:117] "RemoveContainer" containerID="457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.624883 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mqbrq"] Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.661250 4936 scope.go:117] "RemoveContainer" containerID="712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549" Sep 30 14:03:15 crc kubenswrapper[4936]: E0930 14:03:15.662322 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549\": container with ID starting with 712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549 not found: ID does not exist" containerID="712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.662868 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549"} err="failed to get container status \"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549\": rpc error: code = NotFound desc = could not find container \"712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549\": container with ID starting with 712fef47136804eaac09ee099852c57798d2b43926e007921069af60e2b9e549 not found: ID does not exist" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.662895 4936 scope.go:117] "RemoveContainer" containerID="c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7" Sep 30 14:03:15 crc kubenswrapper[4936]: E0930 14:03:15.663386 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7\": container with ID starting with c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7 not found: ID does not exist" containerID="c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.663409 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7"} err="failed to get container status \"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7\": rpc error: code = NotFound desc = could not find container \"c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7\": container with ID starting with c3dd484713c79e7a45faea9f35db3355c6bdc6f3e34b915a66d95aa367fb55e7 not found: ID does not exist" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.663423 4936 scope.go:117] "RemoveContainer" containerID="457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0" Sep 30 14:03:15 crc kubenswrapper[4936]: E0930 
14:03:15.663677 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0\": container with ID starting with 457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0 not found: ID does not exist" containerID="457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0" Sep 30 14:03:15 crc kubenswrapper[4936]: I0930 14:03:15.663698 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0"} err="failed to get container status \"457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0\": rpc error: code = NotFound desc = could not find container \"457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0\": container with ID starting with 457e01f218ea33201e6960e8655bf0ec98241805b95d5c9bc89bc95ea1eceac0 not found: ID does not exist" Sep 30 14:03:16 crc kubenswrapper[4936]: I0930 14:03:16.324590 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" path="/var/lib/kubelet/pods/404b119c-ac63-4bd5-adfd-022409dd83b2/volumes" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.458302 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:29 crc kubenswrapper[4936]: E0930 14:03:29.463931 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="extract-content" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.463966 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="extract-content" Sep 30 14:03:29 crc kubenswrapper[4936]: E0930 14:03:29.463993 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" 
containerName="registry-server" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.464002 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="registry-server" Sep 30 14:03:29 crc kubenswrapper[4936]: E0930 14:03:29.464022 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="extract-utilities" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.464030 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="extract-utilities" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.464252 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="404b119c-ac63-4bd5-adfd-022409dd83b2" containerName="registry-server" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.466013 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.473081 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.594254 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.594785 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h586c\" (UniqueName: \"kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " 
pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.594945 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.696746 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.696869 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.696928 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h586c\" (UniqueName: \"kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.697636 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " 
pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.697697 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.718578 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h586c\" (UniqueName: \"kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c\") pod \"redhat-marketplace-l9qf7\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:29 crc kubenswrapper[4936]: I0930 14:03:29.795457 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:30 crc kubenswrapper[4936]: I0930 14:03:30.282124 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:30 crc kubenswrapper[4936]: W0930 14:03:30.298630 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef3c739_67c0_45b6_aa90_1dcff8532c67.slice/crio-21b1f063a0e346bff9e035fcf41a9f78277e6c1676ffcd07107f3df52c4fc943 WatchSource:0}: Error finding container 21b1f063a0e346bff9e035fcf41a9f78277e6c1676ffcd07107f3df52c4fc943: Status 404 returned error can't find the container with id 21b1f063a0e346bff9e035fcf41a9f78277e6c1676ffcd07107f3df52c4fc943 Sep 30 14:03:30 crc kubenswrapper[4936]: I0930 14:03:30.694815 4936 generic.go:334] "Generic (PLEG): container finished" podID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerID="13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9" exitCode=0 
Sep 30 14:03:30 crc kubenswrapper[4936]: I0930 14:03:30.694866 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerDied","Data":"13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9"} Sep 30 14:03:30 crc kubenswrapper[4936]: I0930 14:03:30.695116 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerStarted","Data":"21b1f063a0e346bff9e035fcf41a9f78277e6c1676ffcd07107f3df52c4fc943"} Sep 30 14:03:31 crc kubenswrapper[4936]: I0930 14:03:31.706014 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerStarted","Data":"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694"} Sep 30 14:03:32 crc kubenswrapper[4936]: I0930 14:03:32.719893 4936 generic.go:334] "Generic (PLEG): container finished" podID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerID="906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694" exitCode=0 Sep 30 14:03:32 crc kubenswrapper[4936]: I0930 14:03:32.719991 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerDied","Data":"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694"} Sep 30 14:03:33 crc kubenswrapper[4936]: I0930 14:03:33.730815 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerStarted","Data":"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2"} Sep 30 14:03:33 crc kubenswrapper[4936]: I0930 14:03:33.753012 4936 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-l9qf7" podStartSLOduration=2.327765456 podStartE2EDuration="4.752992408s" podCreationTimestamp="2025-09-30 14:03:29 +0000 UTC" firstStartedPulling="2025-09-30 14:03:30.697272646 +0000 UTC m=+1461.081274947" lastFinishedPulling="2025-09-30 14:03:33.122499598 +0000 UTC m=+1463.506501899" observedRunningTime="2025-09-30 14:03:33.747465696 +0000 UTC m=+1464.131468007" watchObservedRunningTime="2025-09-30 14:03:33.752992408 +0000 UTC m=+1464.136994709" Sep 30 14:03:36 crc kubenswrapper[4936]: I0930 14:03:36.079906 4936 scope.go:117] "RemoveContainer" containerID="71b9706e96c820d8304f3e7e22e31b2d176d7f00c9388602fd5a4adf8315eedc" Sep 30 14:03:39 crc kubenswrapper[4936]: I0930 14:03:39.797321 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:39 crc kubenswrapper[4936]: I0930 14:03:39.798007 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:39 crc kubenswrapper[4936]: I0930 14:03:39.862274 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:40 crc kubenswrapper[4936]: I0930 14:03:40.831869 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:40 crc kubenswrapper[4936]: I0930 14:03:40.881589 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:42 crc kubenswrapper[4936]: I0930 14:03:42.805196 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l9qf7" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="registry-server" containerID="cri-o://ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2" gracePeriod=2 Sep 30 
14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.245499 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.351808 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities\") pod \"cef3c739-67c0-45b6-aa90-1dcff8532c67\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.351897 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content\") pod \"cef3c739-67c0-45b6-aa90-1dcff8532c67\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.352030 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h586c\" (UniqueName: \"kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c\") pod \"cef3c739-67c0-45b6-aa90-1dcff8532c67\" (UID: \"cef3c739-67c0-45b6-aa90-1dcff8532c67\") " Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.352855 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities" (OuterVolumeSpecName: "utilities") pod "cef3c739-67c0-45b6-aa90-1dcff8532c67" (UID: "cef3c739-67c0-45b6-aa90-1dcff8532c67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.358559 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c" (OuterVolumeSpecName: "kube-api-access-h586c") pod "cef3c739-67c0-45b6-aa90-1dcff8532c67" (UID: "cef3c739-67c0-45b6-aa90-1dcff8532c67"). InnerVolumeSpecName "kube-api-access-h586c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.366048 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cef3c739-67c0-45b6-aa90-1dcff8532c67" (UID: "cef3c739-67c0-45b6-aa90-1dcff8532c67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.454569 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h586c\" (UniqueName: \"kubernetes.io/projected/cef3c739-67c0-45b6-aa90-1dcff8532c67-kube-api-access-h586c\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.454614 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.454629 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef3c739-67c0-45b6-aa90-1dcff8532c67-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.815707 4936 generic.go:334] "Generic (PLEG): container finished" podID="cef3c739-67c0-45b6-aa90-1dcff8532c67" 
containerID="ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2" exitCode=0 Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.815772 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9qf7" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.817310 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerDied","Data":"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2"} Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.817424 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9qf7" event={"ID":"cef3c739-67c0-45b6-aa90-1dcff8532c67","Type":"ContainerDied","Data":"21b1f063a0e346bff9e035fcf41a9f78277e6c1676ffcd07107f3df52c4fc943"} Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.817521 4936 scope.go:117] "RemoveContainer" containerID="ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.840522 4936 scope.go:117] "RemoveContainer" containerID="906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.874177 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.878129 4936 scope.go:117] "RemoveContainer" containerID="13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.884656 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9qf7"] Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.918139 4936 scope.go:117] "RemoveContainer" containerID="ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2" Sep 30 
14:03:43 crc kubenswrapper[4936]: E0930 14:03:43.918705 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2\": container with ID starting with ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2 not found: ID does not exist" containerID="ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.918760 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2"} err="failed to get container status \"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2\": rpc error: code = NotFound desc = could not find container \"ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2\": container with ID starting with ad9edfbf51e24d9180b8504698b7da57a0cf7d2b92e461617da3955ea3362be2 not found: ID does not exist" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.918793 4936 scope.go:117] "RemoveContainer" containerID="906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694" Sep 30 14:03:43 crc kubenswrapper[4936]: E0930 14:03:43.919150 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694\": container with ID starting with 906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694 not found: ID does not exist" containerID="906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.919274 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694"} err="failed to get container status 
\"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694\": rpc error: code = NotFound desc = could not find container \"906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694\": container with ID starting with 906de21aa9d7007dd12be1f66820d230ec40120f5332897eb9f8ef69da9cd694 not found: ID does not exist" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.919311 4936 scope.go:117] "RemoveContainer" containerID="13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9" Sep 30 14:03:43 crc kubenswrapper[4936]: E0930 14:03:43.919628 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9\": container with ID starting with 13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9 not found: ID does not exist" containerID="13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9" Sep 30 14:03:43 crc kubenswrapper[4936]: I0930 14:03:43.919681 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9"} err="failed to get container status \"13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9\": rpc error: code = NotFound desc = could not find container \"13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9\": container with ID starting with 13c20f3a2fb5376ede1a6bce77616be072fe537f894473d33825f34a70b75eb9 not found: ID does not exist" Sep 30 14:03:44 crc kubenswrapper[4936]: I0930 14:03:44.334840 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" path="/var/lib/kubelet/pods/cef3c739-67c0-45b6-aa90-1dcff8532c67/volumes" Sep 30 14:03:48 crc kubenswrapper[4936]: I0930 14:03:48.249998 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:03:48 crc kubenswrapper[4936]: I0930 14:03:48.250982 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.910351 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:03:55 crc kubenswrapper[4936]: E0930 14:03:55.911419 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="registry-server" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.911436 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="registry-server" Sep 30 14:03:55 crc kubenswrapper[4936]: E0930 14:03:55.911462 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="extract-content" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.911470 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="extract-content" Sep 30 14:03:55 crc kubenswrapper[4936]: E0930 14:03:55.911495 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="extract-utilities" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.911505 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="extract-utilities" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 
14:03:55.911731 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef3c739-67c0-45b6-aa90-1dcff8532c67" containerName="registry-server" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.913418 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.924543 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.982286 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.982372 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:55 crc kubenswrapper[4936]: I0930 14:03:55.982554 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96n2z\" (UniqueName: \"kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.083753 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96n2z\" (UniqueName: 
\"kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.084033 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.084110 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.084660 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.085224 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.114269 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96n2z\" (UniqueName: 
\"kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z\") pod \"certified-operators-27skk\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.242892 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.789943 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:03:56 crc kubenswrapper[4936]: I0930 14:03:56.931050 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerStarted","Data":"c357d54c7c2b284d1e1394805b611d254c67c383d8353611abf80548507d86dd"} Sep 30 14:03:57 crc kubenswrapper[4936]: I0930 14:03:57.941384 4936 generic.go:334] "Generic (PLEG): container finished" podID="3559bde6-7452-477a-a90f-f07788fd90b5" containerID="17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a" exitCode=0 Sep 30 14:03:57 crc kubenswrapper[4936]: I0930 14:03:57.941487 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerDied","Data":"17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a"} Sep 30 14:03:58 crc kubenswrapper[4936]: I0930 14:03:58.951645 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerStarted","Data":"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b"} Sep 30 14:04:00 crc kubenswrapper[4936]: I0930 14:04:00.969718 4936 generic.go:334] "Generic (PLEG): container finished" podID="3559bde6-7452-477a-a90f-f07788fd90b5" 
containerID="1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b" exitCode=0 Sep 30 14:04:00 crc kubenswrapper[4936]: I0930 14:04:00.969843 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerDied","Data":"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b"} Sep 30 14:04:01 crc kubenswrapper[4936]: I0930 14:04:01.978922 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerStarted","Data":"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd"} Sep 30 14:04:01 crc kubenswrapper[4936]: I0930 14:04:01.997221 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27skk" podStartSLOduration=3.244187383 podStartE2EDuration="6.997199107s" podCreationTimestamp="2025-09-30 14:03:55 +0000 UTC" firstStartedPulling="2025-09-30 14:03:57.943411442 +0000 UTC m=+1488.327413743" lastFinishedPulling="2025-09-30 14:04:01.696423166 +0000 UTC m=+1492.080425467" observedRunningTime="2025-09-30 14:04:01.996652302 +0000 UTC m=+1492.380654603" watchObservedRunningTime="2025-09-30 14:04:01.997199107 +0000 UTC m=+1492.381201408" Sep 30 14:04:06 crc kubenswrapper[4936]: I0930 14:04:06.243110 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:06 crc kubenswrapper[4936]: I0930 14:04:06.243644 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:06 crc kubenswrapper[4936]: I0930 14:04:06.302887 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:07 crc kubenswrapper[4936]: I0930 
14:04:07.060208 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:07 crc kubenswrapper[4936]: I0930 14:04:07.107764 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.034105 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-27skk" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="registry-server" containerID="cri-o://c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd" gracePeriod=2 Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.466513 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.651860 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96n2z\" (UniqueName: \"kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z\") pod \"3559bde6-7452-477a-a90f-f07788fd90b5\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.652038 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities\") pod \"3559bde6-7452-477a-a90f-f07788fd90b5\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.652079 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content\") pod \"3559bde6-7452-477a-a90f-f07788fd90b5\" (UID: \"3559bde6-7452-477a-a90f-f07788fd90b5\") " Sep 30 14:04:09 crc kubenswrapper[4936]: 
I0930 14:04:09.653707 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities" (OuterVolumeSpecName: "utilities") pod "3559bde6-7452-477a-a90f-f07788fd90b5" (UID: "3559bde6-7452-477a-a90f-f07788fd90b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.663717 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z" (OuterVolumeSpecName: "kube-api-access-96n2z") pod "3559bde6-7452-477a-a90f-f07788fd90b5" (UID: "3559bde6-7452-477a-a90f-f07788fd90b5"). InnerVolumeSpecName "kube-api-access-96n2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.754907 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96n2z\" (UniqueName: \"kubernetes.io/projected/3559bde6-7452-477a-a90f-f07788fd90b5-kube-api-access-96n2z\") on node \"crc\" DevicePath \"\"" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.754951 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.796778 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3559bde6-7452-477a-a90f-f07788fd90b5" (UID: "3559bde6-7452-477a-a90f-f07788fd90b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:04:09 crc kubenswrapper[4936]: I0930 14:04:09.856935 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3559bde6-7452-477a-a90f-f07788fd90b5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.044948 4936 generic.go:334] "Generic (PLEG): container finished" podID="3559bde6-7452-477a-a90f-f07788fd90b5" containerID="c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd" exitCode=0 Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.044996 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerDied","Data":"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd"} Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.045034 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27skk" event={"ID":"3559bde6-7452-477a-a90f-f07788fd90b5","Type":"ContainerDied","Data":"c357d54c7c2b284d1e1394805b611d254c67c383d8353611abf80548507d86dd"} Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.045024 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27skk" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.045055 4936 scope.go:117] "RemoveContainer" containerID="c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.078712 4936 scope.go:117] "RemoveContainer" containerID="1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.084425 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.107111 4936 scope.go:117] "RemoveContainer" containerID="17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.111541 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-27skk"] Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.151591 4936 scope.go:117] "RemoveContainer" containerID="c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd" Sep 30 14:04:10 crc kubenswrapper[4936]: E0930 14:04:10.152149 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd\": container with ID starting with c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd not found: ID does not exist" containerID="c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.152176 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd"} err="failed to get container status \"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd\": rpc error: code = NotFound desc = could not find 
container \"c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd\": container with ID starting with c71f93724095cdea986ef4bcfc64741304ceccf3c771494136cfbb0db0e2ccfd not found: ID does not exist" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.152197 4936 scope.go:117] "RemoveContainer" containerID="1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b" Sep 30 14:04:10 crc kubenswrapper[4936]: E0930 14:04:10.152739 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b\": container with ID starting with 1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b not found: ID does not exist" containerID="1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.152768 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b"} err="failed to get container status \"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b\": rpc error: code = NotFound desc = could not find container \"1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b\": container with ID starting with 1393841cd6881db1be1fa8439315189e971e47a27a7f4bd8c90b588307dfc34b not found: ID does not exist" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.152788 4936 scope.go:117] "RemoveContainer" containerID="17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a" Sep 30 14:04:10 crc kubenswrapper[4936]: E0930 14:04:10.153161 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a\": container with ID starting with 17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a not found: ID does 
not exist" containerID="17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.153188 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a"} err="failed to get container status \"17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a\": rpc error: code = NotFound desc = could not find container \"17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a\": container with ID starting with 17f13edf7c35d8e3dc117ac78ac8c2e4ff203ea73b6e36c86012e2bc155f995a not found: ID does not exist" Sep 30 14:04:10 crc kubenswrapper[4936]: I0930 14:04:10.355233 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" path="/var/lib/kubelet/pods/3559bde6-7452-477a-a90f-f07788fd90b5/volumes" Sep 30 14:04:18 crc kubenswrapper[4936]: I0930 14:04:18.250683 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:04:18 crc kubenswrapper[4936]: I0930 14:04:18.252133 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:04:48 crc kubenswrapper[4936]: I0930 14:04:48.250644 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Sep 30 14:04:48 crc kubenswrapper[4936]: I0930 14:04:48.251180 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:04:48 crc kubenswrapper[4936]: I0930 14:04:48.251222 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:04:48 crc kubenswrapper[4936]: I0930 14:04:48.251893 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:04:48 crc kubenswrapper[4936]: I0930 14:04:48.251942 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" gracePeriod=600 Sep 30 14:04:48 crc kubenswrapper[4936]: E0930 14:04:48.382215 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:04:49 crc kubenswrapper[4936]: 
I0930 14:04:49.365244 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" exitCode=0 Sep 30 14:04:49 crc kubenswrapper[4936]: I0930 14:04:49.365324 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707"} Sep 30 14:04:49 crc kubenswrapper[4936]: I0930 14:04:49.365733 4936 scope.go:117] "RemoveContainer" containerID="9925ed581c77513fa67110fdb500bc3893f95ddcc97c621140a1a3e57e9f5628" Sep 30 14:04:49 crc kubenswrapper[4936]: I0930 14:04:49.366401 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:04:49 crc kubenswrapper[4936]: E0930 14:04:49.366667 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:05:02 crc kubenswrapper[4936]: I0930 14:05:02.319061 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:05:02 crc kubenswrapper[4936]: E0930 14:05:02.319872 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:05:17 crc kubenswrapper[4936]: I0930 14:05:17.316888 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:05:17 crc kubenswrapper[4936]: E0930 14:05:17.317776 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:05:28 crc kubenswrapper[4936]: I0930 14:05:28.315012 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:05:28 crc kubenswrapper[4936]: E0930 14:05:28.315781 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:05:41 crc kubenswrapper[4936]: I0930 14:05:41.316161 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:05:41 crc kubenswrapper[4936]: E0930 14:05:41.316972 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:05:48 crc kubenswrapper[4936]: I0930 14:05:48.882665 4936 generic.go:334] "Generic (PLEG): container finished" podID="0aaaa50a-6929-4654-b42b-ccfcd712d106" containerID="0dff08732339e1b583d0b75d3601c0b458cb041aa1b0d04690b3d23b4928983c" exitCode=0 Sep 30 14:05:48 crc kubenswrapper[4936]: I0930 14:05:48.882745 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" event={"ID":"0aaaa50a-6929-4654-b42b-ccfcd712d106","Type":"ContainerDied","Data":"0dff08732339e1b583d0b75d3601c0b458cb041aa1b0d04690b3d23b4928983c"} Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.291396 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.392639 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory\") pod \"0aaaa50a-6929-4654-b42b-ccfcd712d106\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.393046 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key\") pod \"0aaaa50a-6929-4654-b42b-ccfcd712d106\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.393197 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9qn7\" (UniqueName: \"kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7\") pod 
\"0aaaa50a-6929-4654-b42b-ccfcd712d106\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.394173 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle\") pod \"0aaaa50a-6929-4654-b42b-ccfcd712d106\" (UID: \"0aaaa50a-6929-4654-b42b-ccfcd712d106\") " Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.400596 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7" (OuterVolumeSpecName: "kube-api-access-n9qn7") pod "0aaaa50a-6929-4654-b42b-ccfcd712d106" (UID: "0aaaa50a-6929-4654-b42b-ccfcd712d106"). InnerVolumeSpecName "kube-api-access-n9qn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.401798 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0aaaa50a-6929-4654-b42b-ccfcd712d106" (UID: "0aaaa50a-6929-4654-b42b-ccfcd712d106"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.425263 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0aaaa50a-6929-4654-b42b-ccfcd712d106" (UID: "0aaaa50a-6929-4654-b42b-ccfcd712d106"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.425651 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory" (OuterVolumeSpecName: "inventory") pod "0aaaa50a-6929-4654-b42b-ccfcd712d106" (UID: "0aaaa50a-6929-4654-b42b-ccfcd712d106"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.497506 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.497537 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9qn7\" (UniqueName: \"kubernetes.io/projected/0aaaa50a-6929-4654-b42b-ccfcd712d106-kube-api-access-n9qn7\") on node \"crc\" DevicePath \"\"" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.497550 4936 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.497567 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aaaa50a-6929-4654-b42b-ccfcd712d106-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.902150 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" event={"ID":"0aaaa50a-6929-4654-b42b-ccfcd712d106","Type":"ContainerDied","Data":"0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c"} Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.902195 4936 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="0c4f4cf4e4311790cca4e334a5c70bff789c1faa607b88066cea2400689d5f3c" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.902254 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.987585 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5"] Sep 30 14:05:50 crc kubenswrapper[4936]: E0930 14:05:50.988033 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aaaa50a-6929-4654-b42b-ccfcd712d106" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.988067 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aaaa50a-6929-4654-b42b-ccfcd712d106" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:05:50 crc kubenswrapper[4936]: E0930 14:05:50.988091 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="extract-content" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.988100 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="extract-content" Sep 30 14:05:50 crc kubenswrapper[4936]: E0930 14:05:50.988119 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="extract-utilities" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.988127 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="extract-utilities" Sep 30 14:05:50 crc kubenswrapper[4936]: E0930 14:05:50.988147 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="registry-server" Sep 30 14:05:50 crc 
kubenswrapper[4936]: I0930 14:05:50.988154 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="registry-server" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.988712 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aaaa50a-6929-4654-b42b-ccfcd712d106" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.988740 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3559bde6-7452-477a-a90f-f07788fd90b5" containerName="registry-server" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.989546 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.993504 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.993685 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.993876 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:05:50 crc kubenswrapper[4936]: I0930 14:05:50.994012 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.005495 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5"] Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.105956 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.106025 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.106131 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplvk\" (UniqueName: \"kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.207850 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kplvk\" (UniqueName: \"kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.208510 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.210006 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.214125 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.215426 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.225527 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplvk\" (UniqueName: \"kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.309957 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.828060 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5"] Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.833134 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:05:51 crc kubenswrapper[4936]: I0930 14:05:51.911811 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" event={"ID":"4599dc3b-5593-4f89-a693-7281c98a534e","Type":"ContainerStarted","Data":"a6608ece6122d5e4cdadc51005a9f5643b4aa663e73bd659450064cfead5d66c"} Sep 30 14:05:52 crc kubenswrapper[4936]: I0930 14:05:52.921597 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" event={"ID":"4599dc3b-5593-4f89-a693-7281c98a534e","Type":"ContainerStarted","Data":"ddd7850344c3ca6ec4eb88ecb4f895b05eb41bce698c15cf7096af9402d00087"} Sep 30 14:05:52 crc kubenswrapper[4936]: I0930 14:05:52.943832 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" podStartSLOduration=2.517448705 podStartE2EDuration="2.943802208s" podCreationTimestamp="2025-09-30 14:05:50 +0000 UTC" firstStartedPulling="2025-09-30 14:05:51.832943386 +0000 UTC m=+1602.216945697" lastFinishedPulling="2025-09-30 14:05:52.259296899 +0000 UTC m=+1602.643299200" observedRunningTime="2025-09-30 14:05:52.938844001 +0000 UTC m=+1603.322846312" watchObservedRunningTime="2025-09-30 14:05:52.943802208 +0000 UTC m=+1603.327804509" Sep 30 14:05:54 crc kubenswrapper[4936]: I0930 14:05:54.315832 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 
14:05:54 crc kubenswrapper[4936]: E0930 14:05:54.317212 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.045570 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bs47q"] Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.052920 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p5xhr"] Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.062149 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bs47q"] Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.072043 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p5xhr"] Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.327423 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15771d44-77f5-4db8-98d0-588ddcef9157" path="/var/lib/kubelet/pods/15771d44-77f5-4db8-98d0-588ddcef9157/volumes" Sep 30 14:06:06 crc kubenswrapper[4936]: I0930 14:06:06.328311 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c8cc7e-8b32-4662-bf25-0f95cd8aa48e" path="/var/lib/kubelet/pods/80c8cc7e-8b32-4662-bf25-0f95cd8aa48e/volumes" Sep 30 14:06:07 crc kubenswrapper[4936]: I0930 14:06:07.030852 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gdllr"] Sep 30 14:06:07 crc kubenswrapper[4936]: I0930 14:06:07.039939 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gdllr"] Sep 30 14:06:07 crc kubenswrapper[4936]: I0930 
14:06:07.319496 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:06:07 crc kubenswrapper[4936]: E0930 14:06:07.319895 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:06:08 crc kubenswrapper[4936]: I0930 14:06:08.327750 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996" path="/var/lib/kubelet/pods/98bc6478-04f5-4a0a-a1fc-5f4b7f2ea996/volumes" Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.038127 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3dd9-account-create-5t4zd"] Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.051479 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f65f-account-create-8wxzh"] Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.061213 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3dd9-account-create-5t4zd"] Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.070517 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f65f-account-create-8wxzh"] Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.327781 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40017b63-ca9a-49cf-9e10-40ae1ec179e5" path="/var/lib/kubelet/pods/40017b63-ca9a-49cf-9e10-40ae1ec179e5/volumes" Sep 30 14:06:16 crc kubenswrapper[4936]: I0930 14:06:16.328579 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccfcd490-2af5-4e6d-a842-337e4dfd7333" 
path="/var/lib/kubelet/pods/ccfcd490-2af5-4e6d-a842-337e4dfd7333/volumes" Sep 30 14:06:17 crc kubenswrapper[4936]: I0930 14:06:17.033031 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c9ba-account-create-ntmbh"] Sep 30 14:06:17 crc kubenswrapper[4936]: I0930 14:06:17.043453 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c9ba-account-create-ntmbh"] Sep 30 14:06:18 crc kubenswrapper[4936]: I0930 14:06:18.327461 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670c827e-fdbe-47de-8b0f-be94d7668f2c" path="/var/lib/kubelet/pods/670c827e-fdbe-47de-8b0f-be94d7668f2c/volumes" Sep 30 14:06:19 crc kubenswrapper[4936]: I0930 14:06:19.315363 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:06:19 crc kubenswrapper[4936]: E0930 14:06:19.315782 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:06:33 crc kubenswrapper[4936]: I0930 14:06:33.316262 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:06:33 crc kubenswrapper[4936]: E0930 14:06:33.317205 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.228592 4936 scope.go:117] "RemoveContainer" containerID="654b9df2ff2d0010cb049bcf7ee486cecc64a65db3ef9f0c15bedad1a0f8c669" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.253417 4936 scope.go:117] "RemoveContainer" containerID="9acf4412ffb072a291b0b1a040b54e051f3a732523c9ffe7c0409b2e4457514a" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.326645 4936 scope.go:117] "RemoveContainer" containerID="4b837882bbb445147dbafe995ccea4009b68786ba254c56c1951db46f1f0aac9" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.364712 4936 scope.go:117] "RemoveContainer" containerID="66dcfefdb2eafb4a182a0538c511380c43fb7a29673f850650cad36d67359aba" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.489636 4936 scope.go:117] "RemoveContainer" containerID="9adb2d3bba33510151914d052fa7503c2eb047b7ab39a9206fd63a662109c13f" Sep 30 14:06:36 crc kubenswrapper[4936]: I0930 14:06:36.514642 4936 scope.go:117] "RemoveContainer" containerID="f49c9cfa268a348031096503f9bddc3c976e9c1b5c055dac12816fbd94ee9864" Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.056523 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-szn75"] Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.066738 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-szn75"] Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.076618 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-j2wgp"] Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.087924 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j69z9"] Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.095684 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-j2wgp"] Sep 30 14:06:37 crc kubenswrapper[4936]: I0930 14:06:37.102688 4936 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-j69z9"] Sep 30 14:06:38 crc kubenswrapper[4936]: I0930 14:06:38.327205 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300c85d9-b8e5-4d33-9fd2-86964369fe57" path="/var/lib/kubelet/pods/300c85d9-b8e5-4d33-9fd2-86964369fe57/volumes" Sep 30 14:06:38 crc kubenswrapper[4936]: I0930 14:06:38.328281 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49846e9-5f5a-4e32-8f59-e9a10e2e98af" path="/var/lib/kubelet/pods/d49846e9-5f5a-4e32-8f59-e9a10e2e98af/volumes" Sep 30 14:06:38 crc kubenswrapper[4936]: I0930 14:06:38.340519 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a43128-6ad1-47d4-80ea-afcf5532be4f" path="/var/lib/kubelet/pods/f6a43128-6ad1-47d4-80ea-afcf5532be4f/volumes" Sep 30 14:06:44 crc kubenswrapper[4936]: I0930 14:06:44.034487 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-62f6-account-create-275gf"] Sep 30 14:06:44 crc kubenswrapper[4936]: I0930 14:06:44.042910 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-62f6-account-create-275gf"] Sep 30 14:06:44 crc kubenswrapper[4936]: I0930 14:06:44.328817 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8743c649-a846-418d-9946-774e4cfb2553" path="/var/lib/kubelet/pods/8743c649-a846-418d-9946-774e4cfb2553/volumes" Sep 30 14:06:46 crc kubenswrapper[4936]: I0930 14:06:46.316289 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:06:46 crc kubenswrapper[4936]: E0930 14:06:46.316883 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:06:52 crc kubenswrapper[4936]: I0930 14:06:52.039817 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rwxb8"] Sep 30 14:06:52 crc kubenswrapper[4936]: I0930 14:06:52.048774 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rwxb8"] Sep 30 14:06:52 crc kubenswrapper[4936]: I0930 14:06:52.326475 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37485d56-bd76-442f-b986-00ccd913d8cb" path="/var/lib/kubelet/pods/37485d56-bd76-442f-b986-00ccd913d8cb/volumes" Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.037993 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-08c3-account-create-cbk2n"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.047061 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-l85t5"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.057030 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-08c3-account-create-cbk2n"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.067111 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l85t5"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.076598 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9507-account-create-fkds9"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.110913 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9507-account-create-fkds9"] Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.325784 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660cc428-d532-48bb-8272-d698d7b4b8db" path="/var/lib/kubelet/pods/660cc428-d532-48bb-8272-d698d7b4b8db/volumes" Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.326733 4936 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5a2293-5cc6-4ee1-92dd-d63c04c1d390" path="/var/lib/kubelet/pods/6b5a2293-5cc6-4ee1-92dd-d63c04c1d390/volumes" Sep 30 14:06:54 crc kubenswrapper[4936]: I0930 14:06:54.330682 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb49398-578d-45ba-beee-c10d7ecd9b37" path="/var/lib/kubelet/pods/9bb49398-578d-45ba-beee-c10d7ecd9b37/volumes" Sep 30 14:06:58 crc kubenswrapper[4936]: I0930 14:06:58.315394 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:06:58 crc kubenswrapper[4936]: E0930 14:06:58.316192 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:07:11 crc kubenswrapper[4936]: I0930 14:07:11.315303 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:07:11 crc kubenswrapper[4936]: E0930 14:07:11.316088 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:07:17 crc kubenswrapper[4936]: I0930 14:07:17.680628 4936 generic.go:334] "Generic (PLEG): container finished" podID="4599dc3b-5593-4f89-a693-7281c98a534e" 
containerID="ddd7850344c3ca6ec4eb88ecb4f895b05eb41bce698c15cf7096af9402d00087" exitCode=0 Sep 30 14:07:17 crc kubenswrapper[4936]: I0930 14:07:17.680710 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" event={"ID":"4599dc3b-5593-4f89-a693-7281c98a534e","Type":"ContainerDied","Data":"ddd7850344c3ca6ec4eb88ecb4f895b05eb41bce698c15cf7096af9402d00087"} Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.087513 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.169997 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key\") pod \"4599dc3b-5593-4f89-a693-7281c98a534e\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.170089 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory\") pod \"4599dc3b-5593-4f89-a693-7281c98a534e\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.170225 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kplvk\" (UniqueName: \"kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk\") pod \"4599dc3b-5593-4f89-a693-7281c98a534e\" (UID: \"4599dc3b-5593-4f89-a693-7281c98a534e\") " Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.179150 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk" (OuterVolumeSpecName: "kube-api-access-kplvk") pod 
"4599dc3b-5593-4f89-a693-7281c98a534e" (UID: "4599dc3b-5593-4f89-a693-7281c98a534e"). InnerVolumeSpecName "kube-api-access-kplvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.201404 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4599dc3b-5593-4f89-a693-7281c98a534e" (UID: "4599dc3b-5593-4f89-a693-7281c98a534e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.203701 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory" (OuterVolumeSpecName: "inventory") pod "4599dc3b-5593-4f89-a693-7281c98a534e" (UID: "4599dc3b-5593-4f89-a693-7281c98a534e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.272476 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.272779 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4599dc3b-5593-4f89-a693-7281c98a534e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.272858 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kplvk\" (UniqueName: \"kubernetes.io/projected/4599dc3b-5593-4f89-a693-7281c98a534e-kube-api-access-kplvk\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.699209 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" event={"ID":"4599dc3b-5593-4f89-a693-7281c98a534e","Type":"ContainerDied","Data":"a6608ece6122d5e4cdadc51005a9f5643b4aa663e73bd659450064cfead5d66c"} Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.699598 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6608ece6122d5e4cdadc51005a9f5643b4aa663e73bd659450064cfead5d66c" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.699264 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.787645 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq"] Sep 30 14:07:19 crc kubenswrapper[4936]: E0930 14:07:19.788197 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4599dc3b-5593-4f89-a693-7281c98a534e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.788220 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4599dc3b-5593-4f89-a693-7281c98a534e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.788455 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4599dc3b-5593-4f89-a693-7281c98a534e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.789139 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.792681 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.793031 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.793436 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.793693 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.808637 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq"] Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.884356 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.884409 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.884690 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgg8m\" (UniqueName: \"kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.986283 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.986466 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgg8m\" (UniqueName: \"kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.986552 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.992205 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:19 crc kubenswrapper[4936]: I0930 14:07:19.992498 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:20 crc kubenswrapper[4936]: I0930 14:07:20.010696 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgg8m\" (UniqueName: \"kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mknzq\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:20 crc kubenswrapper[4936]: I0930 14:07:20.106738 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:20 crc kubenswrapper[4936]: I0930 14:07:20.654684 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq"] Sep 30 14:07:20 crc kubenswrapper[4936]: I0930 14:07:20.713920 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" event={"ID":"4524efd0-030f-4c85-93ee-88a1abfd34f8","Type":"ContainerStarted","Data":"b849ad6d5edb17644e470fa99929f6cf69d2df7622ebe258307abc0eee7b58e0"} Sep 30 14:07:21 crc kubenswrapper[4936]: I0930 14:07:21.727185 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" event={"ID":"4524efd0-030f-4c85-93ee-88a1abfd34f8","Type":"ContainerStarted","Data":"d811d9862ddbe51b97a9262d1798a6f985f8de23f5fda18cbe34d25b51bfe0f4"} Sep 30 14:07:21 crc kubenswrapper[4936]: I0930 14:07:21.755328 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" podStartSLOduration=2.16028383 podStartE2EDuration="2.755306186s" podCreationTimestamp="2025-09-30 14:07:19 +0000 UTC" firstStartedPulling="2025-09-30 14:07:20.672622429 +0000 UTC m=+1691.056624730" lastFinishedPulling="2025-09-30 14:07:21.267644795 +0000 UTC m=+1691.651647086" observedRunningTime="2025-09-30 14:07:21.748224553 +0000 UTC m=+1692.132226884" watchObservedRunningTime="2025-09-30 14:07:21.755306186 +0000 UTC m=+1692.139308487" Sep 30 14:07:26 crc kubenswrapper[4936]: I0930 14:07:26.315649 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:07:26 crc kubenswrapper[4936]: E0930 14:07:26.316571 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:07:27 crc kubenswrapper[4936]: I0930 14:07:27.780556 4936 generic.go:334] "Generic (PLEG): container finished" podID="4524efd0-030f-4c85-93ee-88a1abfd34f8" containerID="d811d9862ddbe51b97a9262d1798a6f985f8de23f5fda18cbe34d25b51bfe0f4" exitCode=0 Sep 30 14:07:27 crc kubenswrapper[4936]: I0930 14:07:27.780757 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" event={"ID":"4524efd0-030f-4c85-93ee-88a1abfd34f8","Type":"ContainerDied","Data":"d811d9862ddbe51b97a9262d1798a6f985f8de23f5fda18cbe34d25b51bfe0f4"} Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.256742 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.273713 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory\") pod \"4524efd0-030f-4c85-93ee-88a1abfd34f8\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.273934 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgg8m\" (UniqueName: \"kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m\") pod \"4524efd0-030f-4c85-93ee-88a1abfd34f8\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.274036 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key\") pod \"4524efd0-030f-4c85-93ee-88a1abfd34f8\" (UID: \"4524efd0-030f-4c85-93ee-88a1abfd34f8\") " Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.301067 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m" (OuterVolumeSpecName: "kube-api-access-mgg8m") pod "4524efd0-030f-4c85-93ee-88a1abfd34f8" (UID: "4524efd0-030f-4c85-93ee-88a1abfd34f8"). InnerVolumeSpecName "kube-api-access-mgg8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.311070 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory" (OuterVolumeSpecName: "inventory") pod "4524efd0-030f-4c85-93ee-88a1abfd34f8" (UID: "4524efd0-030f-4c85-93ee-88a1abfd34f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.311613 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4524efd0-030f-4c85-93ee-88a1abfd34f8" (UID: "4524efd0-030f-4c85-93ee-88a1abfd34f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.376997 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgg8m\" (UniqueName: \"kubernetes.io/projected/4524efd0-030f-4c85-93ee-88a1abfd34f8-kube-api-access-mgg8m\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.377327 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.377438 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4524efd0-030f-4c85-93ee-88a1abfd34f8-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.801247 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" event={"ID":"4524efd0-030f-4c85-93ee-88a1abfd34f8","Type":"ContainerDied","Data":"b849ad6d5edb17644e470fa99929f6cf69d2df7622ebe258307abc0eee7b58e0"} Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.801620 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b849ad6d5edb17644e470fa99929f6cf69d2df7622ebe258307abc0eee7b58e0" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.801519 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.895242 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99"] Sep 30 14:07:29 crc kubenswrapper[4936]: E0930 14:07:29.895774 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4524efd0-030f-4c85-93ee-88a1abfd34f8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.895802 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4524efd0-030f-4c85-93ee-88a1abfd34f8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.896023 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4524efd0-030f-4c85-93ee-88a1abfd34f8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.896884 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.906529 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.906529 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.906945 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.907117 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.919508 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99"] Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.988604 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.988800 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx769\" (UniqueName: \"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:29 crc kubenswrapper[4936]: I0930 14:07:29.989042 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.092655 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.092738 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx769\" (UniqueName: \"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.092764 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.100285 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: 
\"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.101999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.113259 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx769\" (UniqueName: \"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-scc99\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.218293 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.776776 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99"] Sep 30 14:07:30 crc kubenswrapper[4936]: I0930 14:07:30.820154 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" event={"ID":"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9","Type":"ContainerStarted","Data":"12ec00a36575333741328f7a0b8f0010b1761ea149074804145f3ab3df6af01e"} Sep 30 14:07:31 crc kubenswrapper[4936]: I0930 14:07:31.832482 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" event={"ID":"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9","Type":"ContainerStarted","Data":"a79dcd6405a37b1702409d7e4f9c9945d8768888be2001051b9e65b91a147cc3"} Sep 30 14:07:31 crc kubenswrapper[4936]: I0930 14:07:31.863009 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" podStartSLOduration=2.242792694 podStartE2EDuration="2.862990978s" podCreationTimestamp="2025-09-30 14:07:29 +0000 UTC" firstStartedPulling="2025-09-30 14:07:30.789180924 +0000 UTC m=+1701.173183225" lastFinishedPulling="2025-09-30 14:07:31.409379208 +0000 UTC m=+1701.793381509" observedRunningTime="2025-09-30 14:07:31.85574894 +0000 UTC m=+1702.239751251" watchObservedRunningTime="2025-09-30 14:07:31.862990978 +0000 UTC m=+1702.246993289" Sep 30 14:07:36 crc kubenswrapper[4936]: I0930 14:07:36.747752 4936 scope.go:117] "RemoveContainer" containerID="a42e8b0c3f257fbc5d911478dda460aa3e7e78b70e5afed7356d2fe09a528238" Sep 30 14:07:36 crc kubenswrapper[4936]: I0930 14:07:36.824633 4936 scope.go:117] "RemoveContainer" containerID="59fcefdcde957626ff7aec0338852740946e9f031938675b02b14527b29f749b" Sep 30 14:07:36 crc 
kubenswrapper[4936]: I0930 14:07:36.856216 4936 scope.go:117] "RemoveContainer" containerID="7480448e26978a986e25400e86a84488371b674d1a69e6cc990e13d8f9283bee" Sep 30 14:07:36 crc kubenswrapper[4936]: I0930 14:07:36.898939 4936 scope.go:117] "RemoveContainer" containerID="e76d4a0c938d00f17d421adc31b7b07ef7fd05d6a389edad54b818cd7c52ad88" Sep 30 14:07:36 crc kubenswrapper[4936]: I0930 14:07:36.966646 4936 scope.go:117] "RemoveContainer" containerID="aa5c4932ba8990f0d13a11a5f8116494282ce66beadf47c3447a670008750592" Sep 30 14:07:37 crc kubenswrapper[4936]: I0930 14:07:37.012108 4936 scope.go:117] "RemoveContainer" containerID="f79c7abb49e439f3adfae0dd557fc8b620d7b2a70b6e0403c3ae39f2a693b317" Sep 30 14:07:37 crc kubenswrapper[4936]: I0930 14:07:37.042349 4936 scope.go:117] "RemoveContainer" containerID="b959c0d179df171e5308d2ca62a7a235d1452bd743dd686f0b9ebb02196478de" Sep 30 14:07:37 crc kubenswrapper[4936]: I0930 14:07:37.064993 4936 scope.go:117] "RemoveContainer" containerID="3b22d180dae995bc282493d1c02e5d074063fdcac50fad430a0fe9541c21aa10" Sep 30 14:07:38 crc kubenswrapper[4936]: I0930 14:07:38.320022 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:07:38 crc kubenswrapper[4936]: E0930 14:07:38.320528 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.053544 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gwg87"] Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.064389 4936 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-db-sync-d9lhn"] Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.073429 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gwg87"] Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.082295 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-d9lhn"] Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.325660 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870f7b12-4944-4889-92bc-17f413d6ab36" path="/var/lib/kubelet/pods/870f7b12-4944-4889-92bc-17f413d6ab36/volumes" Sep 30 14:07:42 crc kubenswrapper[4936]: I0930 14:07:42.326494 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc404db-3484-40e8-8241-35b197b3f120" path="/var/lib/kubelet/pods/9dc404db-3484-40e8-8241-35b197b3f120/volumes" Sep 30 14:07:53 crc kubenswrapper[4936]: I0930 14:07:53.315579 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:07:53 crc kubenswrapper[4936]: E0930 14:07:53.316445 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:07:55 crc kubenswrapper[4936]: I0930 14:07:55.033515 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zrdqm"] Sep 30 14:07:55 crc kubenswrapper[4936]: I0930 14:07:55.042461 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zrdqm"] Sep 30 14:07:56 crc kubenswrapper[4936]: I0930 14:07:56.326074 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="88e0e7bf-c7d6-4817-a3bc-77189570dfe6" path="/var/lib/kubelet/pods/88e0e7bf-c7d6-4817-a3bc-77189570dfe6/volumes" Sep 30 14:07:58 crc kubenswrapper[4936]: I0930 14:07:58.045609 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q4sjx"] Sep 30 14:07:58 crc kubenswrapper[4936]: I0930 14:07:58.056081 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q4sjx"] Sep 30 14:07:58 crc kubenswrapper[4936]: I0930 14:07:58.328514 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b0ae70-9c0c-48af-8ad9-a226c9798c4a" path="/var/lib/kubelet/pods/f4b0ae70-9c0c-48af-8ad9-a226c9798c4a/volumes" Sep 30 14:08:02 crc kubenswrapper[4936]: I0930 14:08:02.043391 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lgmpr"] Sep 30 14:08:02 crc kubenswrapper[4936]: I0930 14:08:02.056088 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lgmpr"] Sep 30 14:08:02 crc kubenswrapper[4936]: I0930 14:08:02.328934 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a55d69-0992-4eb2-a974-f4eedb0bf989" path="/var/lib/kubelet/pods/c1a55d69-0992-4eb2-a974-f4eedb0bf989/volumes" Sep 30 14:08:04 crc kubenswrapper[4936]: I0930 14:08:04.315959 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:08:04 crc kubenswrapper[4936]: E0930 14:08:04.316241 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:08:12 crc kubenswrapper[4936]: I0930 14:08:12.192891 4936 
generic.go:334] "Generic (PLEG): container finished" podID="f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" containerID="a79dcd6405a37b1702409d7e4f9c9945d8768888be2001051b9e65b91a147cc3" exitCode=0 Sep 30 14:08:12 crc kubenswrapper[4936]: I0930 14:08:12.193000 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" event={"ID":"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9","Type":"ContainerDied","Data":"a79dcd6405a37b1702409d7e4f9c9945d8768888be2001051b9e65b91a147cc3"} Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.649596 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.789434 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key\") pod \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.789573 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory\") pod \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.789665 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx769\" (UniqueName: \"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769\") pod \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\" (UID: \"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9\") " Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.795570 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769" (OuterVolumeSpecName: "kube-api-access-lx769") pod "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" (UID: "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9"). InnerVolumeSpecName "kube-api-access-lx769". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.816098 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory" (OuterVolumeSpecName: "inventory") pod "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" (UID: "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.818467 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" (UID: "f8c2a378-8126-4f54-bba8-7c2cb53cd1b9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.892133 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx769\" (UniqueName: \"kubernetes.io/projected/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-kube-api-access-lx769\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.892173 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:13 crc kubenswrapper[4936]: I0930 14:08:13.892185 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.222030 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" event={"ID":"f8c2a378-8126-4f54-bba8-7c2cb53cd1b9","Type":"ContainerDied","Data":"12ec00a36575333741328f7a0b8f0010b1761ea149074804145f3ab3df6af01e"} Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.222422 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ec00a36575333741328f7a0b8f0010b1761ea149074804145f3ab3df6af01e" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.222080 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.312821 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx"] Sep 30 14:08:14 crc kubenswrapper[4936]: E0930 14:08:14.313518 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.313544 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.313922 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.318281 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.322313 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.322692 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.322918 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.323286 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.338888 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx"] Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.402275 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.402711 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzsw5\" (UniqueName: \"kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.402822 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.504935 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.505569 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzsw5\" (UniqueName: \"kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.506033 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.511182 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: 
\"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.513133 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.524837 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzsw5\" (UniqueName: \"kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:14 crc kubenswrapper[4936]: I0930 14:08:14.658848 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:15 crc kubenswrapper[4936]: I0930 14:08:15.171063 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx"] Sep 30 14:08:15 crc kubenswrapper[4936]: I0930 14:08:15.232394 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" event={"ID":"277fa445-5647-4778-9d81-1653da4c6df9","Type":"ContainerStarted","Data":"aeb0286e9672ef5d56538f10f724c87ca52f450106ec0fa685f57560034ec387"} Sep 30 14:08:15 crc kubenswrapper[4936]: I0930 14:08:15.315366 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:08:15 crc kubenswrapper[4936]: E0930 14:08:15.315662 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:08:16 crc kubenswrapper[4936]: I0930 14:08:16.243093 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" event={"ID":"277fa445-5647-4778-9d81-1653da4c6df9","Type":"ContainerStarted","Data":"1bb3bbb97189ada9bd7fccd62f6da36d599b587a60a3e4d532824006c0a13517"} Sep 30 14:08:16 crc kubenswrapper[4936]: I0930 14:08:16.264796 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" podStartSLOduration=1.574569027 podStartE2EDuration="2.264772475s" podCreationTimestamp="2025-09-30 14:08:14 +0000 UTC" firstStartedPulling="2025-09-30 
14:08:15.18254739 +0000 UTC m=+1745.566549691" lastFinishedPulling="2025-09-30 14:08:15.872750838 +0000 UTC m=+1746.256753139" observedRunningTime="2025-09-30 14:08:16.262093871 +0000 UTC m=+1746.646096172" watchObservedRunningTime="2025-09-30 14:08:16.264772475 +0000 UTC m=+1746.648774776" Sep 30 14:08:21 crc kubenswrapper[4936]: I0930 14:08:21.286612 4936 generic.go:334] "Generic (PLEG): container finished" podID="277fa445-5647-4778-9d81-1653da4c6df9" containerID="1bb3bbb97189ada9bd7fccd62f6da36d599b587a60a3e4d532824006c0a13517" exitCode=0 Sep 30 14:08:21 crc kubenswrapper[4936]: I0930 14:08:21.286732 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" event={"ID":"277fa445-5647-4778-9d81-1653da4c6df9","Type":"ContainerDied","Data":"1bb3bbb97189ada9bd7fccd62f6da36d599b587a60a3e4d532824006c0a13517"} Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.714536 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.751518 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key\") pod \"277fa445-5647-4778-9d81-1653da4c6df9\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.751580 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory\") pod \"277fa445-5647-4778-9d81-1653da4c6df9\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.751670 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzsw5\" (UniqueName: \"kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5\") pod \"277fa445-5647-4778-9d81-1653da4c6df9\" (UID: \"277fa445-5647-4778-9d81-1653da4c6df9\") " Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.763799 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5" (OuterVolumeSpecName: "kube-api-access-fzsw5") pod "277fa445-5647-4778-9d81-1653da4c6df9" (UID: "277fa445-5647-4778-9d81-1653da4c6df9"). InnerVolumeSpecName "kube-api-access-fzsw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.783789 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "277fa445-5647-4778-9d81-1653da4c6df9" (UID: "277fa445-5647-4778-9d81-1653da4c6df9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.784247 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory" (OuterVolumeSpecName: "inventory") pod "277fa445-5647-4778-9d81-1653da4c6df9" (UID: "277fa445-5647-4778-9d81-1653da4c6df9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.853787 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.853837 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/277fa445-5647-4778-9d81-1653da4c6df9-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:22 crc kubenswrapper[4936]: I0930 14:08:22.853851 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzsw5\" (UniqueName: \"kubernetes.io/projected/277fa445-5647-4778-9d81-1653da4c6df9-kube-api-access-fzsw5\") on node \"crc\" DevicePath \"\"" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.305508 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" event={"ID":"277fa445-5647-4778-9d81-1653da4c6df9","Type":"ContainerDied","Data":"aeb0286e9672ef5d56538f10f724c87ca52f450106ec0fa685f57560034ec387"} Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.305562 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeb0286e9672ef5d56538f10f724c87ca52f450106ec0fa685f57560034ec387" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.305565 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.380969 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9"] Sep 30 14:08:23 crc kubenswrapper[4936]: E0930 14:08:23.381693 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277fa445-5647-4778-9d81-1653da4c6df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.381762 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="277fa445-5647-4778-9d81-1653da4c6df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.381985 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="277fa445-5647-4778-9d81-1653da4c6df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.382727 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.387839 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.388101 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.388249 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.395108 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9"] Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.438800 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.565616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.566058 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsdx\" (UniqueName: \"kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.567302 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.669516 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.669820 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.669949 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsdx\" (UniqueName: \"kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.686138 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpsdx\" (UniqueName: \"kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.833585 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:23 crc kubenswrapper[4936]: I0930 14:08:23.833597 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:24 crc kubenswrapper[4936]: I0930 14:08:24.049055 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:08:24 crc kubenswrapper[4936]: I0930 14:08:24.567519 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9"] Sep 30 14:08:25 crc kubenswrapper[4936]: I0930 14:08:25.325365 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" event={"ID":"f4cdff18-db93-4160-80b9-b4589b47756d","Type":"ContainerStarted","Data":"23db04606cc0098c5010cdf4aa760a99a7ab78b65ad0d90fa1ade05db2902e6b"} Sep 30 14:08:26 crc kubenswrapper[4936]: I0930 14:08:26.337973 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" event={"ID":"f4cdff18-db93-4160-80b9-b4589b47756d","Type":"ContainerStarted","Data":"465c8223e0a763ac438627de1cd9f0d4275c567ea504cd60baca168b9e901e39"} Sep 30 14:08:26 crc kubenswrapper[4936]: I0930 14:08:26.360115 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" podStartSLOduration=2.590213947 podStartE2EDuration="3.360092626s" podCreationTimestamp="2025-09-30 14:08:23 +0000 UTC" firstStartedPulling="2025-09-30 14:08:24.574631864 +0000 UTC m=+1754.958634165" lastFinishedPulling="2025-09-30 14:08:25.344510543 +0000 UTC m=+1755.728512844" observedRunningTime="2025-09-30 14:08:26.359313364 +0000 UTC m=+1756.743315685" watchObservedRunningTime="2025-09-30 14:08:26.360092626 +0000 UTC m=+1756.744094927" Sep 30 14:08:29 crc kubenswrapper[4936]: I0930 14:08:29.322684 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:08:29 crc kubenswrapper[4936]: E0930 14:08:29.323914 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:08:37 crc kubenswrapper[4936]: I0930 14:08:37.239794 4936 scope.go:117] "RemoveContainer" containerID="484841fc4e2d47a267c56d0dc269d056bd589e6bf76c5d44edb85e78197a06c4" Sep 30 14:08:37 crc kubenswrapper[4936]: I0930 14:08:37.317828 4936 scope.go:117] "RemoveContainer" containerID="e0f1eeaedaf8b51b4b84376e35fc277329272693637da3a56a225ba83381af14" Sep 30 14:08:37 crc kubenswrapper[4936]: I0930 14:08:37.365516 4936 scope.go:117] "RemoveContainer" containerID="ac7828e1177620cec3c03e47abca450d452afe663f4eff809ef7c11404f88829" Sep 30 14:08:37 crc kubenswrapper[4936]: I0930 14:08:37.401110 4936 scope.go:117] "RemoveContainer" containerID="61ed5268bcfb5e8028980a859627f7498b97207f38c169f71d935dd3efbde843" Sep 30 14:08:37 crc kubenswrapper[4936]: I0930 14:08:37.457600 4936 scope.go:117] "RemoveContainer" containerID="d4a338b18b03c1738b76b19946a9a2c759f4784c0e1cc7ba79df3b0dd27793a7" Sep 30 14:08:41 crc kubenswrapper[4936]: I0930 14:08:41.316065 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:08:41 crc kubenswrapper[4936]: E0930 14:08:41.316998 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.055806 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-db-create-59tv8"] Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.073220 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-d8pq5"] Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.084382 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wb9vk"] Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.093391 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wb9vk"] Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.100976 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-59tv8"] Sep 30 14:08:45 crc kubenswrapper[4936]: I0930 14:08:45.109860 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-d8pq5"] Sep 30 14:08:46 crc kubenswrapper[4936]: I0930 14:08:46.327722 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5e56b1-fb0d-411b-b8dc-3fe9605ac342" path="/var/lib/kubelet/pods/4c5e56b1-fb0d-411b-b8dc-3fe9605ac342/volumes" Sep 30 14:08:46 crc kubenswrapper[4936]: I0930 14:08:46.328469 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbeaa979-4b15-4193-be6d-3691224ecb0c" path="/var/lib/kubelet/pods/cbeaa979-4b15-4193-be6d-3691224ecb0c/volumes" Sep 30 14:08:46 crc kubenswrapper[4936]: I0930 14:08:46.329169 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3" path="/var/lib/kubelet/pods/f6d66f83-81cf-4f5b-8c24-c1f1dbf6e1b3/volumes" Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 14:08:55.032207 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4628-account-create-jdb2s"] Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 14:08:55.040263 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5b87-account-create-zpj8s"] Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 
14:08:55.050457 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-05e7-account-create-z5bwm"] Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 14:08:55.056359 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-05e7-account-create-z5bwm"] Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 14:08:55.063666 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4628-account-create-jdb2s"] Sep 30 14:08:55 crc kubenswrapper[4936]: I0930 14:08:55.073175 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5b87-account-create-zpj8s"] Sep 30 14:08:56 crc kubenswrapper[4936]: I0930 14:08:56.316249 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:08:56 crc kubenswrapper[4936]: E0930 14:08:56.316670 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:08:56 crc kubenswrapper[4936]: I0930 14:08:56.327842 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be0320c-fe8d-41de-bcf6-97dc4089ed39" path="/var/lib/kubelet/pods/1be0320c-fe8d-41de-bcf6-97dc4089ed39/volumes" Sep 30 14:08:56 crc kubenswrapper[4936]: I0930 14:08:56.328746 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e87f5ac-ee83-4ba4-b456-f378b93edd80" path="/var/lib/kubelet/pods/5e87f5ac-ee83-4ba4-b456-f378b93edd80/volumes" Sep 30 14:08:56 crc kubenswrapper[4936]: I0930 14:08:56.329431 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90caccbb-1dbf-4ba6-af11-3b21f95535dc" 
path="/var/lib/kubelet/pods/90caccbb-1dbf-4ba6-af11-3b21f95535dc/volumes" Sep 30 14:09:07 crc kubenswrapper[4936]: I0930 14:09:07.316300 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:09:07 crc kubenswrapper[4936]: E0930 14:09:07.317625 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:09:18 crc kubenswrapper[4936]: I0930 14:09:18.315553 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:09:18 crc kubenswrapper[4936]: E0930 14:09:18.316428 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:09:20 crc kubenswrapper[4936]: I0930 14:09:20.036799 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5xd69"] Sep 30 14:09:20 crc kubenswrapper[4936]: I0930 14:09:20.045262 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5xd69"] Sep 30 14:09:20 crc kubenswrapper[4936]: I0930 14:09:20.327629 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fda54e-412b-4057-b92b-6b4eb2cde369" 
path="/var/lib/kubelet/pods/22fda54e-412b-4057-b92b-6b4eb2cde369/volumes" Sep 30 14:09:21 crc kubenswrapper[4936]: I0930 14:09:21.796298 4936 generic.go:334] "Generic (PLEG): container finished" podID="f4cdff18-db93-4160-80b9-b4589b47756d" containerID="465c8223e0a763ac438627de1cd9f0d4275c567ea504cd60baca168b9e901e39" exitCode=2 Sep 30 14:09:21 crc kubenswrapper[4936]: I0930 14:09:21.796608 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" event={"ID":"f4cdff18-db93-4160-80b9-b4589b47756d","Type":"ContainerDied","Data":"465c8223e0a763ac438627de1cd9f0d4275c567ea504cd60baca168b9e901e39"} Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.206315 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.291197 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory\") pod \"f4cdff18-db93-4160-80b9-b4589b47756d\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.291287 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsdx\" (UniqueName: \"kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx\") pod \"f4cdff18-db93-4160-80b9-b4589b47756d\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.291386 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key\") pod \"f4cdff18-db93-4160-80b9-b4589b47756d\" (UID: \"f4cdff18-db93-4160-80b9-b4589b47756d\") " Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.296526 4936 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx" (OuterVolumeSpecName: "kube-api-access-dpsdx") pod "f4cdff18-db93-4160-80b9-b4589b47756d" (UID: "f4cdff18-db93-4160-80b9-b4589b47756d"). InnerVolumeSpecName "kube-api-access-dpsdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.317546 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory" (OuterVolumeSpecName: "inventory") pod "f4cdff18-db93-4160-80b9-b4589b47756d" (UID: "f4cdff18-db93-4160-80b9-b4589b47756d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.343756 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f4cdff18-db93-4160-80b9-b4589b47756d" (UID: "f4cdff18-db93-4160-80b9-b4589b47756d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.393796 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.393826 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsdx\" (UniqueName: \"kubernetes.io/projected/f4cdff18-db93-4160-80b9-b4589b47756d-kube-api-access-dpsdx\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.393835 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4cdff18-db93-4160-80b9-b4589b47756d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.814438 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" event={"ID":"f4cdff18-db93-4160-80b9-b4589b47756d","Type":"ContainerDied","Data":"23db04606cc0098c5010cdf4aa760a99a7ab78b65ad0d90fa1ade05db2902e6b"} Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.814488 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23db04606cc0098c5010cdf4aa760a99a7ab78b65ad0d90fa1ade05db2902e6b" Sep 30 14:09:23 crc kubenswrapper[4936]: I0930 14:09:23.814510 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.062155 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg"] Sep 30 14:09:31 crc kubenswrapper[4936]: E0930 14:09:31.063166 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cdff18-db93-4160-80b9-b4589b47756d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.063185 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cdff18-db93-4160-80b9-b4589b47756d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.063420 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cdff18-db93-4160-80b9-b4589b47756d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.064150 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.070993 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.071189 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.071540 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.077618 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.093437 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg"] Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.132170 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqd4p\" (UniqueName: \"kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.132550 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.132678 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.237364 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.237623 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.237786 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqd4p\" (UniqueName: \"kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.249356 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: 
\"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.261575 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.261931 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqd4p\" (UniqueName: \"kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.315546 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:09:31 crc kubenswrapper[4936]: E0930 14:09:31.316369 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.394963 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:09:31 crc kubenswrapper[4936]: I0930 14:09:31.910226 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg"] Sep 30 14:09:32 crc kubenswrapper[4936]: I0930 14:09:32.891654 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" event={"ID":"a7275541-3778-4de4-8ad3-a8fcbf1953d3","Type":"ContainerStarted","Data":"456a3656623913e788b557456436ef2b2fa5ea50e661612db61c7a82ae1d2dae"} Sep 30 14:09:33 crc kubenswrapper[4936]: I0930 14:09:33.900953 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" event={"ID":"a7275541-3778-4de4-8ad3-a8fcbf1953d3","Type":"ContainerStarted","Data":"32b61c4c92d318cc0117e285407b16925714e9ff39ee077daf598911b7a1d49d"} Sep 30 14:09:33 crc kubenswrapper[4936]: I0930 14:09:33.919581 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" podStartSLOduration=1.490118698 podStartE2EDuration="2.919560123s" podCreationTimestamp="2025-09-30 14:09:31 +0000 UTC" firstStartedPulling="2025-09-30 14:09:31.919238831 +0000 UTC m=+1822.303241132" lastFinishedPulling="2025-09-30 14:09:33.348680256 +0000 UTC m=+1823.732682557" observedRunningTime="2025-09-30 14:09:33.915294456 +0000 UTC m=+1824.299296747" watchObservedRunningTime="2025-09-30 14:09:33.919560123 +0000 UTC m=+1824.303562424" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.658565 4936 scope.go:117] "RemoveContainer" containerID="9ebce1b3502f7a87b1305a833b007f180cd900cc76923316de984e091c10f761" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.686429 4936 scope.go:117] "RemoveContainer" containerID="6d6ad0923bf9e3994acea602c1802c2f92683e20ba7a5872cf75577f89d9aa14" Sep 30 14:09:37 crc 
kubenswrapper[4936]: I0930 14:09:37.730520 4936 scope.go:117] "RemoveContainer" containerID="d3bc471e5264a1d1e113b058ffbfc9c96f2b1748aa818ffac0207b57aad2b48d" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.774615 4936 scope.go:117] "RemoveContainer" containerID="f18ba22d77c1eda91097c3c5d1ec71d70943e850aa81b62c66f87da7dbd174b1" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.836243 4936 scope.go:117] "RemoveContainer" containerID="2591f592f6078e6eb57d59bfca4a11bc4b410e53954de87773a95924d1f0c8c6" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.889099 4936 scope.go:117] "RemoveContainer" containerID="e2e114ab60e53fcd60fcb67717152158283014fc519e5d2fb4f16c61af7028bd" Sep 30 14:09:37 crc kubenswrapper[4936]: I0930 14:09:37.942141 4936 scope.go:117] "RemoveContainer" containerID="c994cfd6b7cc5e54c3b2e32b3e0a5a298013d72287be64e7aa3079d59492e5c2" Sep 30 14:09:44 crc kubenswrapper[4936]: I0930 14:09:44.075130 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-j5vlj"] Sep 30 14:09:44 crc kubenswrapper[4936]: I0930 14:09:44.082167 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-j5vlj"] Sep 30 14:09:44 crc kubenswrapper[4936]: I0930 14:09:44.326881 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca906ecd-ab9e-433a-982b-79ca504b6085" path="/var/lib/kubelet/pods/ca906ecd-ab9e-433a-982b-79ca504b6085/volumes" Sep 30 14:09:45 crc kubenswrapper[4936]: I0930 14:09:45.040572 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n9w76"] Sep 30 14:09:45 crc kubenswrapper[4936]: I0930 14:09:45.063438 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n9w76"] Sep 30 14:09:46 crc kubenswrapper[4936]: I0930 14:09:46.315194 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:09:46 crc 
kubenswrapper[4936]: E0930 14:09:46.315754 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:09:46 crc kubenswrapper[4936]: I0930 14:09:46.325322 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe466675-627d-41b4-910f-921c06848e48" path="/var/lib/kubelet/pods/fe466675-627d-41b4-910f-921c06848e48/volumes" Sep 30 14:10:01 crc kubenswrapper[4936]: I0930 14:10:01.316022 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:10:02 crc kubenswrapper[4936]: I0930 14:10:02.142122 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552"} Sep 30 14:10:22 crc kubenswrapper[4936]: I0930 14:10:22.307974 4936 generic.go:334] "Generic (PLEG): container finished" podID="a7275541-3778-4de4-8ad3-a8fcbf1953d3" containerID="32b61c4c92d318cc0117e285407b16925714e9ff39ee077daf598911b7a1d49d" exitCode=0 Sep 30 14:10:22 crc kubenswrapper[4936]: I0930 14:10:22.308043 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" event={"ID":"a7275541-3778-4de4-8ad3-a8fcbf1953d3","Type":"ContainerDied","Data":"32b61c4c92d318cc0117e285407b16925714e9ff39ee077daf598911b7a1d49d"} Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.725099 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.892099 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory\") pod \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.892233 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqd4p\" (UniqueName: \"kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p\") pod \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.892372 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key\") pod \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\" (UID: \"a7275541-3778-4de4-8ad3-a8fcbf1953d3\") " Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.899727 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p" (OuterVolumeSpecName: "kube-api-access-rqd4p") pod "a7275541-3778-4de4-8ad3-a8fcbf1953d3" (UID: "a7275541-3778-4de4-8ad3-a8fcbf1953d3"). InnerVolumeSpecName "kube-api-access-rqd4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.920090 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7275541-3778-4de4-8ad3-a8fcbf1953d3" (UID: "a7275541-3778-4de4-8ad3-a8fcbf1953d3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.921905 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory" (OuterVolumeSpecName: "inventory") pod "a7275541-3778-4de4-8ad3-a8fcbf1953d3" (UID: "a7275541-3778-4de4-8ad3-a8fcbf1953d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.994565 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.994601 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7275541-3778-4de4-8ad3-a8fcbf1953d3-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:23 crc kubenswrapper[4936]: I0930 14:10:23.994612 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqd4p\" (UniqueName: \"kubernetes.io/projected/a7275541-3778-4de4-8ad3-a8fcbf1953d3-kube-api-access-rqd4p\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.344153 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" event={"ID":"a7275541-3778-4de4-8ad3-a8fcbf1953d3","Type":"ContainerDied","Data":"456a3656623913e788b557456436ef2b2fa5ea50e661612db61c7a82ae1d2dae"} Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.344196 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="456a3656623913e788b557456436ef2b2fa5ea50e661612db61c7a82ae1d2dae" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.344277 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.425916 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b8bcf"] Sep 30 14:10:24 crc kubenswrapper[4936]: E0930 14:10:24.426633 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7275541-3778-4de4-8ad3-a8fcbf1953d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.426653 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7275541-3778-4de4-8ad3-a8fcbf1953d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.426897 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7275541-3778-4de4-8ad3-a8fcbf1953d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.427666 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.431417 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.431767 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.431757 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.431815 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.438504 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b8bcf"] Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.611194 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpfvk\" (UniqueName: \"kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.611667 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.611788 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.713826 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpfvk\" (UniqueName: \"kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.713933 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.713997 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.718064 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.726970 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.729996 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpfvk\" (UniqueName: \"kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk\") pod \"ssh-known-hosts-edpm-deployment-b8bcf\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:24 crc kubenswrapper[4936]: I0930 14:10:24.755774 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:25 crc kubenswrapper[4936]: I0930 14:10:25.050467 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5npq5"] Sep 30 14:10:25 crc kubenswrapper[4936]: I0930 14:10:25.060447 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5npq5"] Sep 30 14:10:25 crc kubenswrapper[4936]: I0930 14:10:25.286825 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b8bcf"] Sep 30 14:10:25 crc kubenswrapper[4936]: I0930 14:10:25.355266 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" event={"ID":"808224a4-e93e-4ee8-81b6-ad06d7888ad5","Type":"ContainerStarted","Data":"26c2d029d3be1422d6db43b44cb1714937c86318e7933dc15727541d2c03fb13"} Sep 30 14:10:26 crc kubenswrapper[4936]: I0930 14:10:26.329425 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6eb7d7-1462-435c-b651-7c28fbc5256c" 
path="/var/lib/kubelet/pods/cb6eb7d7-1462-435c-b651-7c28fbc5256c/volumes" Sep 30 14:10:26 crc kubenswrapper[4936]: I0930 14:10:26.367866 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" event={"ID":"808224a4-e93e-4ee8-81b6-ad06d7888ad5","Type":"ContainerStarted","Data":"8211be7ad09f98644eaec8878ca9cd406fdb543a4a45cf24770a81d076921130"} Sep 30 14:10:26 crc kubenswrapper[4936]: I0930 14:10:26.395394 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" podStartSLOduration=1.857723542 podStartE2EDuration="2.395368172s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="2025-09-30 14:10:25.28412859 +0000 UTC m=+1875.668130881" lastFinishedPulling="2025-09-30 14:10:25.82177321 +0000 UTC m=+1876.205775511" observedRunningTime="2025-09-30 14:10:26.38369674 +0000 UTC m=+1876.767699061" watchObservedRunningTime="2025-09-30 14:10:26.395368172 +0000 UTC m=+1876.779370473" Sep 30 14:10:33 crc kubenswrapper[4936]: I0930 14:10:33.444243 4936 generic.go:334] "Generic (PLEG): container finished" podID="808224a4-e93e-4ee8-81b6-ad06d7888ad5" containerID="8211be7ad09f98644eaec8878ca9cd406fdb543a4a45cf24770a81d076921130" exitCode=0 Sep 30 14:10:33 crc kubenswrapper[4936]: I0930 14:10:33.444327 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" event={"ID":"808224a4-e93e-4ee8-81b6-ad06d7888ad5","Type":"ContainerDied","Data":"8211be7ad09f98644eaec8878ca9cd406fdb543a4a45cf24770a81d076921130"} Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.099139 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.206714 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam\") pod \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.206777 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0\") pod \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.206844 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpfvk\" (UniqueName: \"kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk\") pod \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\" (UID: \"808224a4-e93e-4ee8-81b6-ad06d7888ad5\") " Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.233911 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk" (OuterVolumeSpecName: "kube-api-access-dpfvk") pod "808224a4-e93e-4ee8-81b6-ad06d7888ad5" (UID: "808224a4-e93e-4ee8-81b6-ad06d7888ad5"). InnerVolumeSpecName "kube-api-access-dpfvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.242285 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "808224a4-e93e-4ee8-81b6-ad06d7888ad5" (UID: "808224a4-e93e-4ee8-81b6-ad06d7888ad5"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.284270 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "808224a4-e93e-4ee8-81b6-ad06d7888ad5" (UID: "808224a4-e93e-4ee8-81b6-ad06d7888ad5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.308562 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.308601 4936 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/808224a4-e93e-4ee8-81b6-ad06d7888ad5-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.308611 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpfvk\" (UniqueName: \"kubernetes.io/projected/808224a4-e93e-4ee8-81b6-ad06d7888ad5-kube-api-access-dpfvk\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.467116 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" event={"ID":"808224a4-e93e-4ee8-81b6-ad06d7888ad5","Type":"ContainerDied","Data":"26c2d029d3be1422d6db43b44cb1714937c86318e7933dc15727541d2c03fb13"} Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.467484 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c2d029d3be1422d6db43b44cb1714937c86318e7933dc15727541d2c03fb13" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.467620 
4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b8bcf" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.558018 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh"] Sep 30 14:10:35 crc kubenswrapper[4936]: E0930 14:10:35.558618 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808224a4-e93e-4ee8-81b6-ad06d7888ad5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.558731 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="808224a4-e93e-4ee8-81b6-ad06d7888ad5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.559067 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="808224a4-e93e-4ee8-81b6-ad06d7888ad5" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.559862 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.564431 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.564654 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.566105 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.570435 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh"] Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.571725 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.713828 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.714023 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2z8\" (UniqueName: \"kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.714176 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.815642 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.815760 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.815824 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2z8\" (UniqueName: \"kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.821419 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.821435 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.838464 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl2z8\" (UniqueName: \"kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-77hzh\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:35 crc kubenswrapper[4936]: I0930 14:10:35.877933 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:36 crc kubenswrapper[4936]: I0930 14:10:36.413399 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh"] Sep 30 14:10:36 crc kubenswrapper[4936]: I0930 14:10:36.475592 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" event={"ID":"90a18518-5de3-4b3f-87b7-c34e2c4c52b6","Type":"ContainerStarted","Data":"d93d7ab301bff4d97ed8cb8b4650e56fc4f99900fb9a9a035d854ad2e7768216"} Sep 30 14:10:37 crc kubenswrapper[4936]: I0930 14:10:37.484965 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" event={"ID":"90a18518-5de3-4b3f-87b7-c34e2c4c52b6","Type":"ContainerStarted","Data":"3983d129d634053d41003d76b524b501d3713e6a1d127dfea30e78141f0c8ad4"} Sep 30 14:10:37 crc kubenswrapper[4936]: I0930 14:10:37.506472 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" podStartSLOduration=2.012807501 podStartE2EDuration="2.506447229s" podCreationTimestamp="2025-09-30 14:10:35 +0000 UTC" firstStartedPulling="2025-09-30 14:10:36.415605819 +0000 UTC m=+1886.799608110" lastFinishedPulling="2025-09-30 14:10:36.909245537 +0000 UTC m=+1887.293247838" observedRunningTime="2025-09-30 14:10:37.503322375 +0000 UTC m=+1887.887324676" watchObservedRunningTime="2025-09-30 14:10:37.506447229 +0000 UTC m=+1887.890449530" Sep 30 14:10:38 crc kubenswrapper[4936]: I0930 14:10:38.090274 4936 scope.go:117] "RemoveContainer" containerID="f8a5d50b6dc48579a0b04fefd0890b3e70f3d4e0f3837f4127a0b4d9c9d096c7" Sep 30 14:10:38 crc kubenswrapper[4936]: I0930 14:10:38.154772 4936 scope.go:117] "RemoveContainer" containerID="a345b11302e09c7538e7e71958d7fe5df5b465ae48da0abdada4eb967b37e2e7" Sep 30 14:10:38 crc kubenswrapper[4936]: I0930 
14:10:38.199726 4936 scope.go:117] "RemoveContainer" containerID="305017129b05b760bf688c9f231a690a3c5a17e6794fc81a822ccbdd493b299b" Sep 30 14:10:46 crc kubenswrapper[4936]: I0930 14:10:46.581505 4936 generic.go:334] "Generic (PLEG): container finished" podID="90a18518-5de3-4b3f-87b7-c34e2c4c52b6" containerID="3983d129d634053d41003d76b524b501d3713e6a1d127dfea30e78141f0c8ad4" exitCode=0 Sep 30 14:10:46 crc kubenswrapper[4936]: I0930 14:10:46.581570 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" event={"ID":"90a18518-5de3-4b3f-87b7-c34e2c4c52b6","Type":"ContainerDied","Data":"3983d129d634053d41003d76b524b501d3713e6a1d127dfea30e78141f0c8ad4"} Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.004159 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.153999 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory\") pod \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.154237 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl2z8\" (UniqueName: \"kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8\") pod \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.154279 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key\") pod \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\" (UID: \"90a18518-5de3-4b3f-87b7-c34e2c4c52b6\") " Sep 30 14:10:48 crc 
kubenswrapper[4936]: I0930 14:10:48.160019 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8" (OuterVolumeSpecName: "kube-api-access-hl2z8") pod "90a18518-5de3-4b3f-87b7-c34e2c4c52b6" (UID: "90a18518-5de3-4b3f-87b7-c34e2c4c52b6"). InnerVolumeSpecName "kube-api-access-hl2z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.182604 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90a18518-5de3-4b3f-87b7-c34e2c4c52b6" (UID: "90a18518-5de3-4b3f-87b7-c34e2c4c52b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.185727 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory" (OuterVolumeSpecName: "inventory") pod "90a18518-5de3-4b3f-87b7-c34e2c4c52b6" (UID: "90a18518-5de3-4b3f-87b7-c34e2c4c52b6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.256694 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.256749 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl2z8\" (UniqueName: \"kubernetes.io/projected/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-kube-api-access-hl2z8\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.256764 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90a18518-5de3-4b3f-87b7-c34e2c4c52b6-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.599298 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" event={"ID":"90a18518-5de3-4b3f-87b7-c34e2c4c52b6","Type":"ContainerDied","Data":"d93d7ab301bff4d97ed8cb8b4650e56fc4f99900fb9a9a035d854ad2e7768216"} Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.599358 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d93d7ab301bff4d97ed8cb8b4650e56fc4f99900fb9a9a035d854ad2e7768216" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.599429 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.671834 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj"] Sep 30 14:10:48 crc kubenswrapper[4936]: E0930 14:10:48.672198 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a18518-5de3-4b3f-87b7-c34e2c4c52b6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.672215 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a18518-5de3-4b3f-87b7-c34e2c4c52b6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.672442 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a18518-5de3-4b3f-87b7-c34e2c4c52b6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.673164 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.675452 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.676655 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.676937 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.677110 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.687149 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj"] Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.765487 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjqc\" (UniqueName: \"kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.765558 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.765602 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.866806 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjqc\" (UniqueName: \"kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.867085 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.867149 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.874422 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.874665 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:48 crc kubenswrapper[4936]: I0930 14:10:48.888520 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjqc\" (UniqueName: \"kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:49 crc kubenswrapper[4936]: I0930 14:10:49.002100 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:10:49 crc kubenswrapper[4936]: I0930 14:10:49.513262 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj"] Sep 30 14:10:49 crc kubenswrapper[4936]: W0930 14:10:49.518643 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b58aac_6cb3_4b3f_ba54_e036399e7270.slice/crio-2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1 WatchSource:0}: Error finding container 2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1: Status 404 returned error can't find the container with id 2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1 Sep 30 14:10:49 crc kubenswrapper[4936]: I0930 14:10:49.607997 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" event={"ID":"20b58aac-6cb3-4b3f-ba54-e036399e7270","Type":"ContainerStarted","Data":"2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1"} Sep 30 14:10:50 crc kubenswrapper[4936]: I0930 14:10:50.619279 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" event={"ID":"20b58aac-6cb3-4b3f-ba54-e036399e7270","Type":"ContainerStarted","Data":"66bde40905f722d5f8da8e187a0bc98ecd5b2bb6e0557a94acd035411d785c4f"} Sep 30 14:10:50 crc kubenswrapper[4936]: I0930 14:10:50.645472 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" podStartSLOduration=2.078449114 podStartE2EDuration="2.645453701s" podCreationTimestamp="2025-09-30 14:10:48 +0000 UTC" firstStartedPulling="2025-09-30 14:10:49.52123688 +0000 UTC m=+1899.905239181" lastFinishedPulling="2025-09-30 14:10:50.088241467 +0000 UTC m=+1900.472243768" 
observedRunningTime="2025-09-30 14:10:50.637210654 +0000 UTC m=+1901.021212955" watchObservedRunningTime="2025-09-30 14:10:50.645453701 +0000 UTC m=+1901.029456002" Sep 30 14:11:00 crc kubenswrapper[4936]: I0930 14:11:00.720925 4936 generic.go:334] "Generic (PLEG): container finished" podID="20b58aac-6cb3-4b3f-ba54-e036399e7270" containerID="66bde40905f722d5f8da8e187a0bc98ecd5b2bb6e0557a94acd035411d785c4f" exitCode=0 Sep 30 14:11:00 crc kubenswrapper[4936]: I0930 14:11:00.721955 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" event={"ID":"20b58aac-6cb3-4b3f-ba54-e036399e7270","Type":"ContainerDied","Data":"66bde40905f722d5f8da8e187a0bc98ecd5b2bb6e0557a94acd035411d785c4f"} Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.155761 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.212092 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvjqc\" (UniqueName: \"kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc\") pod \"20b58aac-6cb3-4b3f-ba54-e036399e7270\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.212155 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key\") pod \"20b58aac-6cb3-4b3f-ba54-e036399e7270\" (UID: \"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.212237 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory\") pod \"20b58aac-6cb3-4b3f-ba54-e036399e7270\" (UID: 
\"20b58aac-6cb3-4b3f-ba54-e036399e7270\") " Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.218217 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc" (OuterVolumeSpecName: "kube-api-access-jvjqc") pod "20b58aac-6cb3-4b3f-ba54-e036399e7270" (UID: "20b58aac-6cb3-4b3f-ba54-e036399e7270"). InnerVolumeSpecName "kube-api-access-jvjqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.239316 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "20b58aac-6cb3-4b3f-ba54-e036399e7270" (UID: "20b58aac-6cb3-4b3f-ba54-e036399e7270"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.240716 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory" (OuterVolumeSpecName: "inventory") pod "20b58aac-6cb3-4b3f-ba54-e036399e7270" (UID: "20b58aac-6cb3-4b3f-ba54-e036399e7270"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.315463 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvjqc\" (UniqueName: \"kubernetes.io/projected/20b58aac-6cb3-4b3f-ba54-e036399e7270-kube-api-access-jvjqc\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.315501 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.315514 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b58aac-6cb3-4b3f-ba54-e036399e7270-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.741919 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" event={"ID":"20b58aac-6cb3-4b3f-ba54-e036399e7270","Type":"ContainerDied","Data":"2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1"} Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.741977 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f44aa438bd19cdcb8d1d78768624cd5e9bcf000b9593034b58907e3670579d1" Sep 30 14:11:02 crc kubenswrapper[4936]: I0930 14:11:02.741973 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj" Sep 30 14:12:18 crc kubenswrapper[4936]: I0930 14:12:18.250420 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:12:18 crc kubenswrapper[4936]: I0930 14:12:18.251469 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:48 crc kubenswrapper[4936]: I0930 14:12:48.250415 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:12:48 crc kubenswrapper[4936]: I0930 14:12:48.251538 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.687556 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:12:57 crc kubenswrapper[4936]: E0930 14:12:57.689431 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b58aac-6cb3-4b3f-ba54-e036399e7270" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 
14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.689542 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b58aac-6cb3-4b3f-ba54-e036399e7270" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.689851 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b58aac-6cb3-4b3f-ba54-e036399e7270" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.691627 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.708134 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.837436 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.837747 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47sl\" (UniqueName: \"kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.837913 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content\") pod \"community-operators-7rr2n\" (UID: 
\"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.956755 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.957125 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47sl\" (UniqueName: \"kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.957232 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.957440 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.957557 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") 
" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:57 crc kubenswrapper[4936]: I0930 14:12:57.977141 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47sl\" (UniqueName: \"kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl\") pod \"community-operators-7rr2n\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:58 crc kubenswrapper[4936]: I0930 14:12:58.015156 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:12:58 crc kubenswrapper[4936]: I0930 14:12:58.626993 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:12:58 crc kubenswrapper[4936]: I0930 14:12:58.736020 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerStarted","Data":"9e7b994e90b2827a2e9b4a66dd3a8b42f0d839be489ba9973ae5505204b86caf"} Sep 30 14:12:59 crc kubenswrapper[4936]: I0930 14:12:59.745064 4936 generic.go:334] "Generic (PLEG): container finished" podID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerID="25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095" exitCode=0 Sep 30 14:12:59 crc kubenswrapper[4936]: I0930 14:12:59.745155 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerDied","Data":"25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095"} Sep 30 14:12:59 crc kubenswrapper[4936]: I0930 14:12:59.747329 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:13:01 crc kubenswrapper[4936]: I0930 14:13:01.762612 4936 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerStarted","Data":"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0"} Sep 30 14:13:02 crc kubenswrapper[4936]: I0930 14:13:02.773184 4936 generic.go:334] "Generic (PLEG): container finished" podID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerID="f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0" exitCode=0 Sep 30 14:13:02 crc kubenswrapper[4936]: I0930 14:13:02.773236 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerDied","Data":"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0"} Sep 30 14:13:03 crc kubenswrapper[4936]: I0930 14:13:03.782804 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerStarted","Data":"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541"} Sep 30 14:13:03 crc kubenswrapper[4936]: I0930 14:13:03.801813 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7rr2n" podStartSLOduration=3.326185438 podStartE2EDuration="6.801795934s" podCreationTimestamp="2025-09-30 14:12:57 +0000 UTC" firstStartedPulling="2025-09-30 14:12:59.747080878 +0000 UTC m=+2030.131083179" lastFinishedPulling="2025-09-30 14:13:03.222691374 +0000 UTC m=+2033.606693675" observedRunningTime="2025-09-30 14:13:03.799161289 +0000 UTC m=+2034.183163600" watchObservedRunningTime="2025-09-30 14:13:03.801795934 +0000 UTC m=+2034.185798235" Sep 30 14:13:08 crc kubenswrapper[4936]: I0930 14:13:08.015764 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:08 crc kubenswrapper[4936]: I0930 
14:13:08.016395 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:08 crc kubenswrapper[4936]: I0930 14:13:08.066903 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:08 crc kubenswrapper[4936]: I0930 14:13:08.867113 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:08 crc kubenswrapper[4936]: I0930 14:13:08.924624 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:13:10 crc kubenswrapper[4936]: I0930 14:13:10.838081 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7rr2n" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="registry-server" containerID="cri-o://e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541" gracePeriod=2 Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.292422 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.305037 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f47sl\" (UniqueName: \"kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl\") pod \"16de76f7-91cc-4e68-81a4-924353a0c31e\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.305280 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities\") pod \"16de76f7-91cc-4e68-81a4-924353a0c31e\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.305322 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content\") pod \"16de76f7-91cc-4e68-81a4-924353a0c31e\" (UID: \"16de76f7-91cc-4e68-81a4-924353a0c31e\") " Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.306287 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities" (OuterVolumeSpecName: "utilities") pod "16de76f7-91cc-4e68-81a4-924353a0c31e" (UID: "16de76f7-91cc-4e68-81a4-924353a0c31e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.307118 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.317128 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl" (OuterVolumeSpecName: "kube-api-access-f47sl") pod "16de76f7-91cc-4e68-81a4-924353a0c31e" (UID: "16de76f7-91cc-4e68-81a4-924353a0c31e"). InnerVolumeSpecName "kube-api-access-f47sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.356056 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16de76f7-91cc-4e68-81a4-924353a0c31e" (UID: "16de76f7-91cc-4e68-81a4-924353a0c31e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.409210 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16de76f7-91cc-4e68-81a4-924353a0c31e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.409256 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f47sl\" (UniqueName: \"kubernetes.io/projected/16de76f7-91cc-4e68-81a4-924353a0c31e-kube-api-access-f47sl\") on node \"crc\" DevicePath \"\"" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.849440 4936 generic.go:334] "Generic (PLEG): container finished" podID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerID="e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541" exitCode=0 Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.849494 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerDied","Data":"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541"} Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.849540 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rr2n" event={"ID":"16de76f7-91cc-4e68-81a4-924353a0c31e","Type":"ContainerDied","Data":"9e7b994e90b2827a2e9b4a66dd3a8b42f0d839be489ba9973ae5505204b86caf"} Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.849563 4936 scope.go:117] "RemoveContainer" containerID="e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.850715 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7rr2n" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.873054 4936 scope.go:117] "RemoveContainer" containerID="f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.891626 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.898785 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7rr2n"] Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.906683 4936 scope.go:117] "RemoveContainer" containerID="25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.942771 4936 scope.go:117] "RemoveContainer" containerID="e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541" Sep 30 14:13:11 crc kubenswrapper[4936]: E0930 14:13:11.944968 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541\": container with ID starting with e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541 not found: ID does not exist" containerID="e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.945073 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541"} err="failed to get container status \"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541\": rpc error: code = NotFound desc = could not find container \"e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541\": container with ID starting with e7a147e11e3eb951b0c126172381b7ef1f36c17f6649d2a3374dec0aec224541 not 
found: ID does not exist" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.945151 4936 scope.go:117] "RemoveContainer" containerID="f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0" Sep 30 14:13:11 crc kubenswrapper[4936]: E0930 14:13:11.945706 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0\": container with ID starting with f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0 not found: ID does not exist" containerID="f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.945839 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0"} err="failed to get container status \"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0\": rpc error: code = NotFound desc = could not find container \"f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0\": container with ID starting with f67f1512ada65caf1567933a72b984f23f0e6c994ea8fffd406d43fe64b8aea0 not found: ID does not exist" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.945991 4936 scope.go:117] "RemoveContainer" containerID="25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095" Sep 30 14:13:11 crc kubenswrapper[4936]: E0930 14:13:11.946508 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095\": container with ID starting with 25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095 not found: ID does not exist" containerID="25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095" Sep 30 14:13:11 crc kubenswrapper[4936]: I0930 14:13:11.946657 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095"} err="failed to get container status \"25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095\": rpc error: code = NotFound desc = could not find container \"25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095\": container with ID starting with 25f46348677e4d5df4c877c1a892fd3b799fde390ed18471e89b08958a8d7095 not found: ID does not exist" Sep 30 14:13:12 crc kubenswrapper[4936]: I0930 14:13:12.328356 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" path="/var/lib/kubelet/pods/16de76f7-91cc-4e68-81a4-924353a0c31e/volumes" Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.250491 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.251516 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.251586 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.252288 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.252360 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552" gracePeriod=600 Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.929110 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552" exitCode=0 Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.929224 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552"} Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.929474 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a"} Sep 30 14:13:18 crc kubenswrapper[4936]: I0930 14:13:18.929498 4936 scope.go:117] "RemoveContainer" containerID="6edebb5e41b55790bce5fcbb1b41ce0592e36da736418a2ba0b204a13a8b9707" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.156193 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k8t8h"] Sep 30 14:13:44 crc kubenswrapper[4936]: E0930 14:13:44.157240 
4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="extract-utilities" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.157261 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="extract-utilities" Sep 30 14:13:44 crc kubenswrapper[4936]: E0930 14:13:44.157281 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="registry-server" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.157289 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="registry-server" Sep 30 14:13:44 crc kubenswrapper[4936]: E0930 14:13:44.157305 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="extract-content" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.157330 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="extract-content" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.157636 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="16de76f7-91cc-4e68-81a4-924353a0c31e" containerName="registry-server" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.159191 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.167153 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8t8h"] Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.278695 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-utilities\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.278734 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-catalog-content\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.278767 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsl7g\" (UniqueName: \"kubernetes.io/projected/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-kube-api-access-dsl7g\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.381022 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-utilities\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.381080 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-catalog-content\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.381116 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsl7g\" (UniqueName: \"kubernetes.io/projected/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-kube-api-access-dsl7g\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.381691 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-utilities\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.381753 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-catalog-content\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.412275 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsl7g\" (UniqueName: \"kubernetes.io/projected/d779fd94-9fb2-4bd0-95c3-d7ac8885b589-kube-api-access-dsl7g\") pod \"redhat-operators-k8t8h\" (UID: \"d779fd94-9fb2-4bd0-95c3-d7ac8885b589\") " pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.483364 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:13:44 crc kubenswrapper[4936]: I0930 14:13:44.990084 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8t8h"] Sep 30 14:13:45 crc kubenswrapper[4936]: I0930 14:13:45.138795 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8t8h" event={"ID":"d779fd94-9fb2-4bd0-95c3-d7ac8885b589","Type":"ContainerStarted","Data":"6ecc109c3e1001d44a911330c80bf844aae00fd546dce0fc086351e2a7f8e78d"} Sep 30 14:13:46 crc kubenswrapper[4936]: I0930 14:13:46.150001 4936 generic.go:334] "Generic (PLEG): container finished" podID="d779fd94-9fb2-4bd0-95c3-d7ac8885b589" containerID="56455620a81b0dbbf5d006707d97775afbda384bb5fd7064507896f97cfd38c8" exitCode=0 Sep 30 14:13:46 crc kubenswrapper[4936]: I0930 14:13:46.150048 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8t8h" event={"ID":"d779fd94-9fb2-4bd0-95c3-d7ac8885b589","Type":"ContainerDied","Data":"56455620a81b0dbbf5d006707d97775afbda384bb5fd7064507896f97cfd38c8"} Sep 30 14:13:57 crc kubenswrapper[4936]: I0930 14:13:57.240707 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8t8h" event={"ID":"d779fd94-9fb2-4bd0-95c3-d7ac8885b589","Type":"ContainerStarted","Data":"47d4ee737e498e9ba73b72651615b827352f6de6e301dfa701ac0c79273c669a"} Sep 30 14:13:59 crc kubenswrapper[4936]: I0930 14:13:59.257642 4936 generic.go:334] "Generic (PLEG): container finished" podID="d779fd94-9fb2-4bd0-95c3-d7ac8885b589" containerID="47d4ee737e498e9ba73b72651615b827352f6de6e301dfa701ac0c79273c669a" exitCode=0 Sep 30 14:13:59 crc kubenswrapper[4936]: I0930 14:13:59.257695 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8t8h" 
event={"ID":"d779fd94-9fb2-4bd0-95c3-d7ac8885b589","Type":"ContainerDied","Data":"47d4ee737e498e9ba73b72651615b827352f6de6e301dfa701ac0c79273c669a"} Sep 30 14:14:01 crc kubenswrapper[4936]: I0930 14:14:01.285835 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8t8h" event={"ID":"d779fd94-9fb2-4bd0-95c3-d7ac8885b589","Type":"ContainerStarted","Data":"20269c1edcf0d26abccc197fc3c1df991313b0b14e15cd75f7bac6ca911d70fb"} Sep 30 14:14:04 crc kubenswrapper[4936]: I0930 14:14:04.483572 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:14:04 crc kubenswrapper[4936]: I0930 14:14:04.484136 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:14:05 crc kubenswrapper[4936]: I0930 14:14:05.526684 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k8t8h" podUID="d779fd94-9fb2-4bd0-95c3-d7ac8885b589" containerName="registry-server" probeResult="failure" output=< Sep 30 14:14:05 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:14:05 crc kubenswrapper[4936]: > Sep 30 14:14:14 crc kubenswrapper[4936]: I0930 14:14:14.527291 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:14:14 crc kubenswrapper[4936]: I0930 14:14:14.558563 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k8t8h" podStartSLOduration=16.228986117 podStartE2EDuration="30.55854504s" podCreationTimestamp="2025-09-30 14:13:44 +0000 UTC" firstStartedPulling="2025-09-30 14:13:46.151842522 +0000 UTC m=+2076.535844823" lastFinishedPulling="2025-09-30 14:14:00.481401445 +0000 UTC m=+2090.865403746" observedRunningTime="2025-09-30 14:14:01.305532756 +0000 UTC m=+2091.689535057" 
watchObservedRunningTime="2025-09-30 14:14:14.55854504 +0000 UTC m=+2104.942547341" Sep 30 14:14:14 crc kubenswrapper[4936]: I0930 14:14:14.577561 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k8t8h" Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.189462 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8t8h"] Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.363572 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.363838 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bfp5r" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="registry-server" containerID="cri-o://a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140" gracePeriod=2 Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.913580 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.965010 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities\") pod \"b315e18d-a8cb-41cc-8626-403e2204e403\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.965176 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content\") pod \"b315e18d-a8cb-41cc-8626-403e2204e403\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.965272 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts4zk\" (UniqueName: \"kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk\") pod \"b315e18d-a8cb-41cc-8626-403e2204e403\" (UID: \"b315e18d-a8cb-41cc-8626-403e2204e403\") " Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.974415 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities" (OuterVolumeSpecName: "utilities") pod "b315e18d-a8cb-41cc-8626-403e2204e403" (UID: "b315e18d-a8cb-41cc-8626-403e2204e403"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:14:15 crc kubenswrapper[4936]: I0930 14:14:15.992276 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk" (OuterVolumeSpecName: "kube-api-access-ts4zk") pod "b315e18d-a8cb-41cc-8626-403e2204e403" (UID: "b315e18d-a8cb-41cc-8626-403e2204e403"). InnerVolumeSpecName "kube-api-access-ts4zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.067854 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.068324 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts4zk\" (UniqueName: \"kubernetes.io/projected/b315e18d-a8cb-41cc-8626-403e2204e403-kube-api-access-ts4zk\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.083245 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b315e18d-a8cb-41cc-8626-403e2204e403" (UID: "b315e18d-a8cb-41cc-8626-403e2204e403"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.169714 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b315e18d-a8cb-41cc-8626-403e2204e403-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.427713 4936 generic.go:334] "Generic (PLEG): container finished" podID="b315e18d-a8cb-41cc-8626-403e2204e403" containerID="a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140" exitCode=0 Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.427785 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bfp5r" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.427852 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerDied","Data":"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140"} Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.427886 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bfp5r" event={"ID":"b315e18d-a8cb-41cc-8626-403e2204e403","Type":"ContainerDied","Data":"ee2fff6044568d75132cd04e916aefcc320754b005ef47494faf23ee7c113f48"} Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.427907 4936 scope.go:117] "RemoveContainer" containerID="a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.453272 4936 scope.go:117] "RemoveContainer" containerID="2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.462851 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.472305 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bfp5r"] Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.490454 4936 scope.go:117] "RemoveContainer" containerID="8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.520032 4936 scope.go:117] "RemoveContainer" containerID="a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140" Sep 30 14:14:16 crc kubenswrapper[4936]: E0930 14:14:16.521363 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140\": container with ID starting with a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140 not found: ID does not exist" containerID="a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.521438 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140"} err="failed to get container status \"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140\": rpc error: code = NotFound desc = could not find container \"a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140\": container with ID starting with a9e29143454a1ed872a47ad4becc9719cb35d06e261dfc398162b8d0bfbbb140 not found: ID does not exist" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.521485 4936 scope.go:117] "RemoveContainer" containerID="2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66" Sep 30 14:14:16 crc kubenswrapper[4936]: E0930 14:14:16.522074 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66\": container with ID starting with 2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66 not found: ID does not exist" containerID="2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.522169 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66"} err="failed to get container status \"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66\": rpc error: code = NotFound desc = could not find container \"2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66\": container with ID 
starting with 2b8af11539896ff178382393aeef37a5f61809def80a7950a98092628ed8eb66 not found: ID does not exist" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.522246 4936 scope.go:117] "RemoveContainer" containerID="8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674" Sep 30 14:14:16 crc kubenswrapper[4936]: E0930 14:14:16.522568 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674\": container with ID starting with 8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674 not found: ID does not exist" containerID="8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674" Sep 30 14:14:16 crc kubenswrapper[4936]: I0930 14:14:16.522664 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674"} err="failed to get container status \"8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674\": rpc error: code = NotFound desc = could not find container \"8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674\": container with ID starting with 8bfab8286954eb633df503dbf1f0c491cb2df21a59ec0bbfd2036aa3e86cd674 not found: ID does not exist" Sep 30 14:14:18 crc kubenswrapper[4936]: I0930 14:14:18.326386 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" path="/var/lib/kubelet/pods/b315e18d-a8cb-41cc-8626-403e2204e403/volumes" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.148043 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6"] Sep 30 14:15:00 crc kubenswrapper[4936]: E0930 14:15:00.149711 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="registry-server" Sep 30 
14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.149741 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="registry-server" Sep 30 14:15:00 crc kubenswrapper[4936]: E0930 14:15:00.149760 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="extract-content" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.149768 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="extract-content" Sep 30 14:15:00 crc kubenswrapper[4936]: E0930 14:15:00.149819 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="extract-utilities" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.149830 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="extract-utilities" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.150090 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b315e18d-a8cb-41cc-8626-403e2204e403" containerName="registry-server" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.155250 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.159378 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.159322 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.162698 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6"] Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.273202 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.274085 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.274147 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqv4j\" (UniqueName: \"kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.376394 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.377580 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.377763 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.377934 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqv4j\" (UniqueName: \"kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.384089 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.401563 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqv4j\" (UniqueName: \"kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j\") pod \"collect-profiles-29320695-6cps6\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:00 crc kubenswrapper[4936]: I0930 14:15:00.490840 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:01 crc kubenswrapper[4936]: I0930 14:15:01.104601 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6"] Sep 30 14:15:01 crc kubenswrapper[4936]: I0930 14:15:01.815601 4936 generic.go:334] "Generic (PLEG): container finished" podID="b80ba637-9833-410d-8ca9-012957837b73" containerID="f358bbc3c56b5e7a9ff0718b6063f2ebfc0dc9ebac2ab4d54453bccfb01a4dfc" exitCode=0 Sep 30 14:15:01 crc kubenswrapper[4936]: I0930 14:15:01.815669 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" event={"ID":"b80ba637-9833-410d-8ca9-012957837b73","Type":"ContainerDied","Data":"f358bbc3c56b5e7a9ff0718b6063f2ebfc0dc9ebac2ab4d54453bccfb01a4dfc"} Sep 30 14:15:01 crc kubenswrapper[4936]: I0930 14:15:01.816156 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" 
event={"ID":"b80ba637-9833-410d-8ca9-012957837b73","Type":"ContainerStarted","Data":"eccf6111d8edefb592c1de95c8083364caec7e2b3de5cfffb6ac956f6a742e6b"} Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.147518 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.151261 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume\") pod \"b80ba637-9833-410d-8ca9-012957837b73\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.151359 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqv4j\" (UniqueName: \"kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j\") pod \"b80ba637-9833-410d-8ca9-012957837b73\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.151429 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume\") pod \"b80ba637-9833-410d-8ca9-012957837b73\" (UID: \"b80ba637-9833-410d-8ca9-012957837b73\") " Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.152638 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume" (OuterVolumeSpecName: "config-volume") pod "b80ba637-9833-410d-8ca9-012957837b73" (UID: "b80ba637-9833-410d-8ca9-012957837b73"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.162217 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j" (OuterVolumeSpecName: "kube-api-access-tqv4j") pod "b80ba637-9833-410d-8ca9-012957837b73" (UID: "b80ba637-9833-410d-8ca9-012957837b73"). InnerVolumeSpecName "kube-api-access-tqv4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.186549 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b80ba637-9833-410d-8ca9-012957837b73" (UID: "b80ba637-9833-410d-8ca9-012957837b73"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.253838 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80ba637-9833-410d-8ca9-012957837b73-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.253887 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqv4j\" (UniqueName: \"kubernetes.io/projected/b80ba637-9833-410d-8ca9-012957837b73-kube-api-access-tqv4j\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.253896 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80ba637-9833-410d-8ca9-012957837b73-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.832965 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" 
event={"ID":"b80ba637-9833-410d-8ca9-012957837b73","Type":"ContainerDied","Data":"eccf6111d8edefb592c1de95c8083364caec7e2b3de5cfffb6ac956f6a742e6b"} Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.833241 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eccf6111d8edefb592c1de95c8083364caec7e2b3de5cfffb6ac956f6a742e6b" Sep 30 14:15:03 crc kubenswrapper[4936]: I0930 14:15:03.833071 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6" Sep 30 14:15:04 crc kubenswrapper[4936]: I0930 14:15:04.222064 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"] Sep 30 14:15:04 crc kubenswrapper[4936]: I0930 14:15:04.229826 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-cqhgz"] Sep 30 14:15:04 crc kubenswrapper[4936]: I0930 14:15:04.345655 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661fcd49-29e9-4299-8fa7-9696bb5d1944" path="/var/lib/kubelet/pods/661fcd49-29e9-4299-8fa7-9696bb5d1944/volumes" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.341748 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:16 crc kubenswrapper[4936]: E0930 14:15:16.343706 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80ba637-9833-410d-8ca9-012957837b73" containerName="collect-profiles" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.343894 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80ba637-9833-410d-8ca9-012957837b73" containerName="collect-profiles" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.344187 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80ba637-9833-410d-8ca9-012957837b73" containerName="collect-profiles" 
Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.346351 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.373815 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.511063 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797n8\" (UniqueName: \"kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.511136 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.511170 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.613514 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797n8\" (UniqueName: \"kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " 
pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.613602 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.613649 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.614197 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.614407 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.636220 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797n8\" (UniqueName: \"kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8\") pod \"certified-operators-68pkv\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " 
pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:16 crc kubenswrapper[4936]: I0930 14:15:16.721774 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:17 crc kubenswrapper[4936]: I0930 14:15:17.254715 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:17 crc kubenswrapper[4936]: I0930 14:15:17.952625 4936 generic.go:334] "Generic (PLEG): container finished" podID="08923044-b2d7-46aa-95c9-10455ce97de6" containerID="5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e" exitCode=0 Sep 30 14:15:17 crc kubenswrapper[4936]: I0930 14:15:17.952701 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerDied","Data":"5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e"} Sep 30 14:15:17 crc kubenswrapper[4936]: I0930 14:15:17.954537 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerStarted","Data":"27870c8c4d2f25f8cfb71eced6540c4125a785876932b511611174b7fc6002e1"} Sep 30 14:15:18 crc kubenswrapper[4936]: I0930 14:15:18.249845 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:15:18 crc kubenswrapper[4936]: I0930 14:15:18.250212 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:15:18 crc kubenswrapper[4936]: I0930 14:15:18.962924 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerStarted","Data":"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf"} Sep 30 14:15:20 crc kubenswrapper[4936]: I0930 14:15:20.978852 4936 generic.go:334] "Generic (PLEG): container finished" podID="08923044-b2d7-46aa-95c9-10455ce97de6" containerID="0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf" exitCode=0 Sep 30 14:15:20 crc kubenswrapper[4936]: I0930 14:15:20.979224 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerDied","Data":"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf"} Sep 30 14:15:21 crc kubenswrapper[4936]: I0930 14:15:21.990133 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerStarted","Data":"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f"} Sep 30 14:15:22 crc kubenswrapper[4936]: I0930 14:15:22.030454 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-68pkv" podStartSLOduration=2.39219781 podStartE2EDuration="6.030392188s" podCreationTimestamp="2025-09-30 14:15:16 +0000 UTC" firstStartedPulling="2025-09-30 14:15:17.954546356 +0000 UTC m=+2168.338548657" lastFinishedPulling="2025-09-30 14:15:21.592740734 +0000 UTC m=+2171.976743035" observedRunningTime="2025-09-30 14:15:22.016001171 +0000 UTC m=+2172.400003492" watchObservedRunningTime="2025-09-30 14:15:22.030392188 +0000 UTC m=+2172.414394479" Sep 30 14:15:26 crc kubenswrapper[4936]: I0930 
14:15:26.722248 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:26 crc kubenswrapper[4936]: I0930 14:15:26.722880 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:26 crc kubenswrapper[4936]: I0930 14:15:26.772256 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:27 crc kubenswrapper[4936]: I0930 14:15:27.070435 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:27 crc kubenswrapper[4936]: I0930 14:15:27.125214 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.041897 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-68pkv" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="registry-server" containerID="cri-o://59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f" gracePeriod=2 Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.505156 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.665738 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-797n8\" (UniqueName: \"kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8\") pod \"08923044-b2d7-46aa-95c9-10455ce97de6\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.666820 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content\") pod \"08923044-b2d7-46aa-95c9-10455ce97de6\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.666878 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities\") pod \"08923044-b2d7-46aa-95c9-10455ce97de6\" (UID: \"08923044-b2d7-46aa-95c9-10455ce97de6\") " Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.667567 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities" (OuterVolumeSpecName: "utilities") pod "08923044-b2d7-46aa-95c9-10455ce97de6" (UID: "08923044-b2d7-46aa-95c9-10455ce97de6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.681576 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8" (OuterVolumeSpecName: "kube-api-access-797n8") pod "08923044-b2d7-46aa-95c9-10455ce97de6" (UID: "08923044-b2d7-46aa-95c9-10455ce97de6"). InnerVolumeSpecName "kube-api-access-797n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.718069 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08923044-b2d7-46aa-95c9-10455ce97de6" (UID: "08923044-b2d7-46aa-95c9-10455ce97de6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.769196 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-797n8\" (UniqueName: \"kubernetes.io/projected/08923044-b2d7-46aa-95c9-10455ce97de6-kube-api-access-797n8\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.769247 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:29 crc kubenswrapper[4936]: I0930 14:15:29.769258 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08923044-b2d7-46aa-95c9-10455ce97de6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.052730 4936 generic.go:334] "Generic (PLEG): container finished" podID="08923044-b2d7-46aa-95c9-10455ce97de6" containerID="59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f" exitCode=0 Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.052787 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-68pkv" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.052794 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerDied","Data":"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f"} Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.053271 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68pkv" event={"ID":"08923044-b2d7-46aa-95c9-10455ce97de6","Type":"ContainerDied","Data":"27870c8c4d2f25f8cfb71eced6540c4125a785876932b511611174b7fc6002e1"} Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.053293 4936 scope.go:117] "RemoveContainer" containerID="59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.088983 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.096967 4936 scope.go:117] "RemoveContainer" containerID="0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.099232 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-68pkv"] Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.128306 4936 scope.go:117] "RemoveContainer" containerID="5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.163013 4936 scope.go:117] "RemoveContainer" containerID="59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f" Sep 30 14:15:30 crc kubenswrapper[4936]: E0930 14:15:30.163587 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f\": container with ID starting with 59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f not found: ID does not exist" containerID="59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.163640 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f"} err="failed to get container status \"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f\": rpc error: code = NotFound desc = could not find container \"59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f\": container with ID starting with 59af52a0ca5ff048aa5a71d0fcbdf06435134b713f526684d9f45bd4fb6bcf7f not found: ID does not exist" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.163674 4936 scope.go:117] "RemoveContainer" containerID="0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf" Sep 30 14:15:30 crc kubenswrapper[4936]: E0930 14:15:30.164609 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf\": container with ID starting with 0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf not found: ID does not exist" containerID="0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.164670 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf"} err="failed to get container status \"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf\": rpc error: code = NotFound desc = could not find container \"0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf\": container with ID 
starting with 0c19de10523eda5f605a2c5ddcd8cec3d100c6e97a76b8cfa81736ed41fe56cf not found: ID does not exist" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.164695 4936 scope.go:117] "RemoveContainer" containerID="5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e" Sep 30 14:15:30 crc kubenswrapper[4936]: E0930 14:15:30.164985 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e\": container with ID starting with 5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e not found: ID does not exist" containerID="5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.165019 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e"} err="failed to get container status \"5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e\": rpc error: code = NotFound desc = could not find container \"5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e\": container with ID starting with 5a86142c0d7a0d8d4d968eb8c30f5cc3c0e0d34400fae54e4f403d9f99a7e05e not found: ID does not exist" Sep 30 14:15:30 crc kubenswrapper[4936]: I0930 14:15:30.327996 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" path="/var/lib/kubelet/pods/08923044-b2d7-46aa-95c9-10455ce97de6/volumes" Sep 30 14:15:38 crc kubenswrapper[4936]: I0930 14:15:38.400808 4936 scope.go:117] "RemoveContainer" containerID="227d7785a4d3ba90022b984891dc38dd3c518d3b85c575898a13d368fb20d8e1" Sep 30 14:15:48 crc kubenswrapper[4936]: I0930 14:15:48.250010 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:15:48 crc kubenswrapper[4936]: I0930 14:15:48.250609 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.250612 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.252542 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.252716 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.253356 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.253474 4936 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" gracePeriod=600 Sep 30 14:16:18 crc kubenswrapper[4936]: E0930 14:16:18.374283 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.451170 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" exitCode=0 Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.451244 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a"} Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.451549 4936 scope.go:117] "RemoveContainer" containerID="0553bde46e59815e1b7923e522aa41b591dec050dfe87aa6f12d9ea1207d6552" Sep 30 14:16:18 crc kubenswrapper[4936]: I0930 14:16:18.452454 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:16:18 crc kubenswrapper[4936]: E0930 14:16:18.452818 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:16:30 crc kubenswrapper[4936]: I0930 14:16:30.322278 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:16:30 crc kubenswrapper[4936]: E0930 14:16:30.323261 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:16:44 crc kubenswrapper[4936]: I0930 14:16:44.315634 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:16:44 crc kubenswrapper[4936]: E0930 14:16:44.316619 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:16:57 crc kubenswrapper[4936]: I0930 14:16:57.315869 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:16:57 crc kubenswrapper[4936]: E0930 14:16:57.316911 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:17:11 crc kubenswrapper[4936]: I0930 14:17:11.315648 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:17:11 crc kubenswrapper[4936]: E0930 14:17:11.316350 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.446004 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.454966 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.462320 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.471359 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.480591 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.489225 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.496477 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mn5bj"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.502479 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.508245 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.514524 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-77hzh"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.520143 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b8bcf"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.526430 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.533128 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wm6xm"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.540313 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h6k5w"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.547141 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.572778 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b8bcf"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.581857 
4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mknzq"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.588786 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-scc99"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.594829 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-f79cx"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.601076 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bbmjg"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.608283 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pmhl9"] Sep 30 14:17:16 crc kubenswrapper[4936]: I0930 14:17:16.615489 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xt4n5"] Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.324551 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aaaa50a-6929-4654-b42b-ccfcd712d106" path="/var/lib/kubelet/pods/0aaaa50a-6929-4654-b42b-ccfcd712d106/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.325098 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b58aac-6cb3-4b3f-ba54-e036399e7270" path="/var/lib/kubelet/pods/20b58aac-6cb3-4b3f-ba54-e036399e7270/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.325677 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277fa445-5647-4778-9d81-1653da4c6df9" path="/var/lib/kubelet/pods/277fa445-5647-4778-9d81-1653da4c6df9/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.326170 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4524efd0-030f-4c85-93ee-88a1abfd34f8" path="/var/lib/kubelet/pods/4524efd0-030f-4c85-93ee-88a1abfd34f8/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.327153 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4599dc3b-5593-4f89-a693-7281c98a534e" path="/var/lib/kubelet/pods/4599dc3b-5593-4f89-a693-7281c98a534e/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.327699 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808224a4-e93e-4ee8-81b6-ad06d7888ad5" path="/var/lib/kubelet/pods/808224a4-e93e-4ee8-81b6-ad06d7888ad5/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.328210 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a18518-5de3-4b3f-87b7-c34e2c4c52b6" path="/var/lib/kubelet/pods/90a18518-5de3-4b3f-87b7-c34e2c4c52b6/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.329190 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c83542-c990-48c0-8113-17bd96de3cc0" path="/var/lib/kubelet/pods/98c83542-c990-48c0-8113-17bd96de3cc0/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.329798 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7275541-3778-4de4-8ad3-a8fcbf1953d3" path="/var/lib/kubelet/pods/a7275541-3778-4de4-8ad3-a8fcbf1953d3/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.330467 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4cdff18-db93-4160-80b9-b4589b47756d" path="/var/lib/kubelet/pods/f4cdff18-db93-4160-80b9-b4589b47756d/volumes" Sep 30 14:17:18 crc kubenswrapper[4936]: I0930 14:17:18.331417 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c2a378-8126-4f54-bba8-7c2cb53cd1b9" path="/var/lib/kubelet/pods/f8c2a378-8126-4f54-bba8-7c2cb53cd1b9/volumes" Sep 30 14:17:25 crc kubenswrapper[4936]: I0930 14:17:25.315256 4936 scope.go:117] "RemoveContainer" 
containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:17:25 crc kubenswrapper[4936]: E0930 14:17:25.315952 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.257107 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f"] Sep 30 14:17:30 crc kubenswrapper[4936]: E0930 14:17:30.258466 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="registry-server" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.258479 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="registry-server" Sep 30 14:17:30 crc kubenswrapper[4936]: E0930 14:17:30.258488 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="extract-content" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.258493 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="extract-content" Sep 30 14:17:30 crc kubenswrapper[4936]: E0930 14:17:30.258514 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="extract-utilities" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.258520 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="extract-utilities" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 
14:17:30.258732 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="08923044-b2d7-46aa-95c9-10455ce97de6" containerName="registry-server" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.260120 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.263525 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.263537 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.263714 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.263724 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.265640 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.304091 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f"] Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.368608 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.368722 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.368759 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.368799 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.369262 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfd54\" (UniqueName: \"kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.471145 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.471221 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.471241 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.471280 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.471675 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfd54\" (UniqueName: \"kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.478971 
4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.487049 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.488480 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.493789 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfd54\" (UniqueName: \"kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.497870 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") 
" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:30 crc kubenswrapper[4936]: I0930 14:17:30.594481 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.125021 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f"] Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.270173 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.272005 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.302358 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.390320 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.390432 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q8tx\" (UniqueName: \"kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.390465 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.491677 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.491815 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q8tx\" (UniqueName: \"kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.491850 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.492118 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.493259 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.526223 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q8tx\" (UniqueName: \"kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx\") pod \"redhat-marketplace-568r6\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:31 crc kubenswrapper[4936]: I0930 14:17:31.604039 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:32 crc kubenswrapper[4936]: I0930 14:17:32.070294 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:32 crc kubenswrapper[4936]: W0930 14:17:32.083412 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0f3672_416d_47aa_a75b_a224d444ba1f.slice/crio-50ca172c320cab4d286869802db0bdc54e6e31a5356ce51af5fc4cbada77f07b WatchSource:0}: Error finding container 50ca172c320cab4d286869802db0bdc54e6e31a5356ce51af5fc4cbada77f07b: Status 404 returned error can't find the container with id 50ca172c320cab4d286869802db0bdc54e6e31a5356ce51af5fc4cbada77f07b Sep 30 14:17:32 crc kubenswrapper[4936]: I0930 14:17:32.104785 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" event={"ID":"e4365ea1-ca48-47bf-af32-3e82c0a5da8f","Type":"ContainerStarted","Data":"0fa1913f888e652f4bd563ad8fe766902c87ed4d59fd6e2a2fc4bc6c70403db3"} Sep 30 14:17:32 crc kubenswrapper[4936]: I0930 14:17:32.104828 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" event={"ID":"e4365ea1-ca48-47bf-af32-3e82c0a5da8f","Type":"ContainerStarted","Data":"775f42522e7105aaee14cff3cc8934480f9c5fde9f54c3926a820e51d5d2ecc6"} Sep 30 14:17:32 crc kubenswrapper[4936]: I0930 14:17:32.105931 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerStarted","Data":"50ca172c320cab4d286869802db0bdc54e6e31a5356ce51af5fc4cbada77f07b"} Sep 30 14:17:32 crc kubenswrapper[4936]: I0930 14:17:32.122481 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" podStartSLOduration=1.57436469 podStartE2EDuration="2.12246774s" podCreationTimestamp="2025-09-30 14:17:30 +0000 UTC" firstStartedPulling="2025-09-30 14:17:31.13300158 +0000 UTC m=+2301.517003871" lastFinishedPulling="2025-09-30 14:17:31.68110462 +0000 UTC m=+2302.065106921" observedRunningTime="2025-09-30 14:17:32.119818407 +0000 UTC m=+2302.503820708" watchObservedRunningTime="2025-09-30 14:17:32.12246774 +0000 UTC m=+2302.506470041" Sep 30 14:17:33 crc kubenswrapper[4936]: I0930 14:17:33.116850 4936 generic.go:334] "Generic (PLEG): container finished" podID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerID="e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd" exitCode=0 Sep 30 14:17:33 crc kubenswrapper[4936]: I0930 14:17:33.118220 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerDied","Data":"e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd"} Sep 30 14:17:35 crc kubenswrapper[4936]: I0930 14:17:35.149294 4936 generic.go:334] "Generic (PLEG): container finished" podID="df0f3672-416d-47aa-a75b-a224d444ba1f" 
containerID="4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab" exitCode=0 Sep 30 14:17:35 crc kubenswrapper[4936]: I0930 14:17:35.149619 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerDied","Data":"4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab"} Sep 30 14:17:36 crc kubenswrapper[4936]: I0930 14:17:36.160111 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerStarted","Data":"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d"} Sep 30 14:17:36 crc kubenswrapper[4936]: I0930 14:17:36.184725 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-568r6" podStartSLOduration=2.499226542 podStartE2EDuration="5.184709192s" podCreationTimestamp="2025-09-30 14:17:31 +0000 UTC" firstStartedPulling="2025-09-30 14:17:33.120954976 +0000 UTC m=+2303.504957277" lastFinishedPulling="2025-09-30 14:17:35.806437616 +0000 UTC m=+2306.190439927" observedRunningTime="2025-09-30 14:17:36.182232504 +0000 UTC m=+2306.566234815" watchObservedRunningTime="2025-09-30 14:17:36.184709192 +0000 UTC m=+2306.568711483" Sep 30 14:17:36 crc kubenswrapper[4936]: I0930 14:17:36.316002 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:17:36 crc kubenswrapper[4936]: E0930 14:17:36.316324 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.506495 4936 scope.go:117] "RemoveContainer" containerID="9fe64979b53c159f6a00c90b57a10131a14059975709a0e6c31ca9b79c717e0b" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.539912 4936 scope.go:117] "RemoveContainer" containerID="66bde40905f722d5f8da8e187a0bc98ecd5b2bb6e0557a94acd035411d785c4f" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.585155 4936 scope.go:117] "RemoveContainer" containerID="a79dcd6405a37b1702409d7e4f9c9945d8768888be2001051b9e65b91a147cc3" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.662687 4936 scope.go:117] "RemoveContainer" containerID="ddd7850344c3ca6ec4eb88ecb4f895b05eb41bce698c15cf7096af9402d00087" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.719196 4936 scope.go:117] "RemoveContainer" containerID="465c8223e0a763ac438627de1cd9f0d4275c567ea504cd60baca168b9e901e39" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.765540 4936 scope.go:117] "RemoveContainer" containerID="d811d9862ddbe51b97a9262d1798a6f985f8de23f5fda18cbe34d25b51bfe0f4" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.802496 4936 scope.go:117] "RemoveContainer" containerID="0dff08732339e1b583d0b75d3601c0b458cb041aa1b0d04690b3d23b4928983c" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.898802 4936 scope.go:117] "RemoveContainer" containerID="32b61c4c92d318cc0117e285407b16925714e9ff39ee077daf598911b7a1d49d" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.965561 4936 scope.go:117] "RemoveContainer" containerID="1bb3bbb97189ada9bd7fccd62f6da36d599b587a60a3e4d532824006c0a13517" Sep 30 14:17:38 crc kubenswrapper[4936]: I0930 14:17:38.997191 4936 scope.go:117] "RemoveContainer" containerID="8211be7ad09f98644eaec8878ca9cd406fdb543a4a45cf24770a81d076921130" Sep 30 14:17:39 crc kubenswrapper[4936]: I0930 14:17:39.026706 4936 scope.go:117] "RemoveContainer" 
containerID="3983d129d634053d41003d76b524b501d3713e6a1d127dfea30e78141f0c8ad4" Sep 30 14:17:41 crc kubenswrapper[4936]: I0930 14:17:41.604423 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:41 crc kubenswrapper[4936]: I0930 14:17:41.604783 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:41 crc kubenswrapper[4936]: I0930 14:17:41.647759 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:42 crc kubenswrapper[4936]: I0930 14:17:42.267291 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:43 crc kubenswrapper[4936]: I0930 14:17:43.260472 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.239941 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-568r6" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="registry-server" containerID="cri-o://463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d" gracePeriod=2 Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.687811 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.769660 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content\") pod \"df0f3672-416d-47aa-a75b-a224d444ba1f\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.769969 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q8tx\" (UniqueName: \"kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx\") pod \"df0f3672-416d-47aa-a75b-a224d444ba1f\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.770840 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities\") pod \"df0f3672-416d-47aa-a75b-a224d444ba1f\" (UID: \"df0f3672-416d-47aa-a75b-a224d444ba1f\") " Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.771483 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities" (OuterVolumeSpecName: "utilities") pod "df0f3672-416d-47aa-a75b-a224d444ba1f" (UID: "df0f3672-416d-47aa-a75b-a224d444ba1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.776839 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx" (OuterVolumeSpecName: "kube-api-access-8q8tx") pod "df0f3672-416d-47aa-a75b-a224d444ba1f" (UID: "df0f3672-416d-47aa-a75b-a224d444ba1f"). InnerVolumeSpecName "kube-api-access-8q8tx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.785699 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df0f3672-416d-47aa-a75b-a224d444ba1f" (UID: "df0f3672-416d-47aa-a75b-a224d444ba1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.872923 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.872965 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0f3672-416d-47aa-a75b-a224d444ba1f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:44 crc kubenswrapper[4936]: I0930 14:17:44.872978 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q8tx\" (UniqueName: \"kubernetes.io/projected/df0f3672-416d-47aa-a75b-a224d444ba1f-kube-api-access-8q8tx\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.248434 4936 generic.go:334] "Generic (PLEG): container finished" podID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerID="463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d" exitCode=0 Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.248499 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-568r6" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.248510 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerDied","Data":"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d"} Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.248935 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-568r6" event={"ID":"df0f3672-416d-47aa-a75b-a224d444ba1f","Type":"ContainerDied","Data":"50ca172c320cab4d286869802db0bdc54e6e31a5356ce51af5fc4cbada77f07b"} Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.248957 4936 scope.go:117] "RemoveContainer" containerID="463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.266072 4936 scope.go:117] "RemoveContainer" containerID="4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.287449 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.296954 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-568r6"] Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.297593 4936 scope.go:117] "RemoveContainer" containerID="e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.341711 4936 scope.go:117] "RemoveContainer" containerID="463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d" Sep 30 14:17:45 crc kubenswrapper[4936]: E0930 14:17:45.342175 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d\": container with ID starting with 463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d not found: ID does not exist" containerID="463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.342213 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d"} err="failed to get container status \"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d\": rpc error: code = NotFound desc = could not find container \"463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d\": container with ID starting with 463df41730ac49c99ff0006fc6e24f5014ec85c4fd3dd771e0045fa29d88f02d not found: ID does not exist" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.342238 4936 scope.go:117] "RemoveContainer" containerID="4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab" Sep 30 14:17:45 crc kubenswrapper[4936]: E0930 14:17:45.342498 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab\": container with ID starting with 4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab not found: ID does not exist" containerID="4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.342519 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab"} err="failed to get container status \"4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab\": rpc error: code = NotFound desc = could not find container \"4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab\": container with ID 
starting with 4ccc7e2496f4da3d12d743548e619c26a86a22dea3fbbdec009500bf1aac48ab not found: ID does not exist" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.342535 4936 scope.go:117] "RemoveContainer" containerID="e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd" Sep 30 14:17:45 crc kubenswrapper[4936]: E0930 14:17:45.342904 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd\": container with ID starting with e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd not found: ID does not exist" containerID="e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd" Sep 30 14:17:45 crc kubenswrapper[4936]: I0930 14:17:45.342934 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd"} err="failed to get container status \"e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd\": rpc error: code = NotFound desc = could not find container \"e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd\": container with ID starting with e643e15c5864e1c1a0a53df9df02d45b9866a047a80b7e7dc609afc2844f6ddd not found: ID does not exist" Sep 30 14:17:46 crc kubenswrapper[4936]: I0930 14:17:46.257626 4936 generic.go:334] "Generic (PLEG): container finished" podID="e4365ea1-ca48-47bf-af32-3e82c0a5da8f" containerID="0fa1913f888e652f4bd563ad8fe766902c87ed4d59fd6e2a2fc4bc6c70403db3" exitCode=0 Sep 30 14:17:46 crc kubenswrapper[4936]: I0930 14:17:46.257724 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" event={"ID":"e4365ea1-ca48-47bf-af32-3e82c0a5da8f","Type":"ContainerDied","Data":"0fa1913f888e652f4bd563ad8fe766902c87ed4d59fd6e2a2fc4bc6c70403db3"} Sep 30 14:17:46 crc kubenswrapper[4936]: I0930 
14:17:46.326170 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" path="/var/lib/kubelet/pods/df0f3672-416d-47aa-a75b-a224d444ba1f/volumes" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.646410 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.720820 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory\") pod \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.721161 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfd54\" (UniqueName: \"kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54\") pod \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.721272 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key\") pod \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.721295 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle\") pod \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.721417 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph\") pod \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\" (UID: \"e4365ea1-ca48-47bf-af32-3e82c0a5da8f\") " Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.727906 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e4365ea1-ca48-47bf-af32-3e82c0a5da8f" (UID: "e4365ea1-ca48-47bf-af32-3e82c0a5da8f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.730501 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54" (OuterVolumeSpecName: "kube-api-access-dfd54") pod "e4365ea1-ca48-47bf-af32-3e82c0a5da8f" (UID: "e4365ea1-ca48-47bf-af32-3e82c0a5da8f"). InnerVolumeSpecName "kube-api-access-dfd54". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.731876 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph" (OuterVolumeSpecName: "ceph") pod "e4365ea1-ca48-47bf-af32-3e82c0a5da8f" (UID: "e4365ea1-ca48-47bf-af32-3e82c0a5da8f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.755740 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4365ea1-ca48-47bf-af32-3e82c0a5da8f" (UID: "e4365ea1-ca48-47bf-af32-3e82c0a5da8f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.756092 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory" (OuterVolumeSpecName: "inventory") pod "e4365ea1-ca48-47bf-af32-3e82c0a5da8f" (UID: "e4365ea1-ca48-47bf-af32-3e82c0a5da8f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.823844 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.823869 4936 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.823881 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.823907 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:47 crc kubenswrapper[4936]: I0930 14:17:47.823917 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfd54\" (UniqueName: \"kubernetes.io/projected/e4365ea1-ca48-47bf-af32-3e82c0a5da8f-kube-api-access-dfd54\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.274192 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" 
event={"ID":"e4365ea1-ca48-47bf-af32-3e82c0a5da8f","Type":"ContainerDied","Data":"775f42522e7105aaee14cff3cc8934480f9c5fde9f54c3926a820e51d5d2ecc6"} Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.274235 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775f42522e7105aaee14cff3cc8934480f9c5fde9f54c3926a820e51d5d2ecc6" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.274293 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.396164 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k"] Sep 30 14:17:48 crc kubenswrapper[4936]: E0930 14:17:48.397596 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="registry-server" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.397715 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="registry-server" Sep 30 14:17:48 crc kubenswrapper[4936]: E0930 14:17:48.397827 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="extract-utilities" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.398975 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="extract-utilities" Sep 30 14:17:48 crc kubenswrapper[4936]: E0930 14:17:48.399084 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4365ea1-ca48-47bf-af32-3e82c0a5da8f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.399155 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4365ea1-ca48-47bf-af32-3e82c0a5da8f" 
containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:17:48 crc kubenswrapper[4936]: E0930 14:17:48.399233 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="extract-content" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.399302 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="extract-content" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.399705 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0f3672-416d-47aa-a75b-a224d444ba1f" containerName="registry-server" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.399805 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4365ea1-ca48-47bf-af32-3e82c0a5da8f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.400731 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.402957 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.403222 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.403398 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.403690 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.405748 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.410107 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k"] Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.535572 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.536010 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: 
\"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.536164 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9z82\" (UniqueName: \"kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.536371 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.536517 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.638375 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.638470 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.638505 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9z82\" (UniqueName: \"kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.638550 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.638585 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.642512 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.642935 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.643548 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.658995 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.660430 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9z82\" (UniqueName: \"kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:48 crc kubenswrapper[4936]: I0930 14:17:48.720374 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:17:49 crc kubenswrapper[4936]: I0930 14:17:49.240967 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k"] Sep 30 14:17:49 crc kubenswrapper[4936]: I0930 14:17:49.285358 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" event={"ID":"418f655d-bae8-4905-8dfc-770612a750c4","Type":"ContainerStarted","Data":"1f8d1f2fa6507d16ed640c02015e30c5b9d139cd0a5b71d5f4073fd062c6ad06"} Sep 30 14:17:49 crc kubenswrapper[4936]: I0930 14:17:49.316234 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:17:49 crc kubenswrapper[4936]: E0930 14:17:49.316486 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:17:50 crc kubenswrapper[4936]: I0930 14:17:50.296911 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" event={"ID":"418f655d-bae8-4905-8dfc-770612a750c4","Type":"ContainerStarted","Data":"fb91c183f5ae47ab708e4f79ea3306aa2514a818e81a5c65f5b78412205a195c"} Sep 30 14:17:50 crc kubenswrapper[4936]: I0930 14:17:50.322469 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" podStartSLOduration=1.750847832 podStartE2EDuration="2.322447548s" podCreationTimestamp="2025-09-30 14:17:48 +0000 UTC" firstStartedPulling="2025-09-30 
14:17:49.252304693 +0000 UTC m=+2319.636306984" lastFinishedPulling="2025-09-30 14:17:49.823904399 +0000 UTC m=+2320.207906700" observedRunningTime="2025-09-30 14:17:50.317114662 +0000 UTC m=+2320.701116973" watchObservedRunningTime="2025-09-30 14:17:50.322447548 +0000 UTC m=+2320.706449849" Sep 30 14:18:04 crc kubenswrapper[4936]: I0930 14:18:04.315974 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:18:04 crc kubenswrapper[4936]: E0930 14:18:04.316831 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:18:18 crc kubenswrapper[4936]: I0930 14:18:18.315480 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:18:18 crc kubenswrapper[4936]: E0930 14:18:18.316264 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:18:33 crc kubenswrapper[4936]: I0930 14:18:33.315103 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:18:33 crc kubenswrapper[4936]: E0930 14:18:33.315907 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:18:44 crc kubenswrapper[4936]: I0930 14:18:44.320239 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:18:44 crc kubenswrapper[4936]: E0930 14:18:44.320978 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:18:56 crc kubenswrapper[4936]: I0930 14:18:56.315260 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:18:56 crc kubenswrapper[4936]: E0930 14:18:56.316055 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:19:07 crc kubenswrapper[4936]: I0930 14:19:07.315576 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:19:07 crc kubenswrapper[4936]: E0930 14:19:07.316781 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:19:21 crc kubenswrapper[4936]: I0930 14:19:21.315866 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:19:21 crc kubenswrapper[4936]: E0930 14:19:21.317502 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:19:34 crc kubenswrapper[4936]: I0930 14:19:34.096202 4936 generic.go:334] "Generic (PLEG): container finished" podID="418f655d-bae8-4905-8dfc-770612a750c4" containerID="fb91c183f5ae47ab708e4f79ea3306aa2514a818e81a5c65f5b78412205a195c" exitCode=0 Sep 30 14:19:34 crc kubenswrapper[4936]: I0930 14:19:34.096291 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" event={"ID":"418f655d-bae8-4905-8dfc-770612a750c4","Type":"ContainerDied","Data":"fb91c183f5ae47ab708e4f79ea3306aa2514a818e81a5c65f5b78412205a195c"} Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.320110 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:19:35 crc kubenswrapper[4936]: E0930 14:19:35.320831 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.538556 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.636887 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle\") pod \"418f655d-bae8-4905-8dfc-770612a750c4\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.637237 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key\") pod \"418f655d-bae8-4905-8dfc-770612a750c4\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.637325 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory\") pod \"418f655d-bae8-4905-8dfc-770612a750c4\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.637398 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9z82\" (UniqueName: \"kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82\") pod \"418f655d-bae8-4905-8dfc-770612a750c4\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.637437 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph\") pod \"418f655d-bae8-4905-8dfc-770612a750c4\" (UID: \"418f655d-bae8-4905-8dfc-770612a750c4\") " Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.643839 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "418f655d-bae8-4905-8dfc-770612a750c4" (UID: "418f655d-bae8-4905-8dfc-770612a750c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.644760 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph" (OuterVolumeSpecName: "ceph") pod "418f655d-bae8-4905-8dfc-770612a750c4" (UID: "418f655d-bae8-4905-8dfc-770612a750c4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.650278 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82" (OuterVolumeSpecName: "kube-api-access-r9z82") pod "418f655d-bae8-4905-8dfc-770612a750c4" (UID: "418f655d-bae8-4905-8dfc-770612a750c4"). InnerVolumeSpecName "kube-api-access-r9z82". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.672503 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory" (OuterVolumeSpecName: "inventory") pod "418f655d-bae8-4905-8dfc-770612a750c4" (UID: "418f655d-bae8-4905-8dfc-770612a750c4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.673581 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "418f655d-bae8-4905-8dfc-770612a750c4" (UID: "418f655d-bae8-4905-8dfc-770612a750c4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.738952 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.738991 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9z82\" (UniqueName: \"kubernetes.io/projected/418f655d-bae8-4905-8dfc-770612a750c4-kube-api-access-r9z82\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.739003 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.739012 4936 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:35 crc kubenswrapper[4936]: I0930 14:19:35.739021 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/418f655d-bae8-4905-8dfc-770612a750c4-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.116316 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" 
event={"ID":"418f655d-bae8-4905-8dfc-770612a750c4","Type":"ContainerDied","Data":"1f8d1f2fa6507d16ed640c02015e30c5b9d139cd0a5b71d5f4073fd062c6ad06"} Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.116379 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f8d1f2fa6507d16ed640c02015e30c5b9d139cd0a5b71d5f4073fd062c6ad06" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.116409 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.221237 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf"] Sep 30 14:19:36 crc kubenswrapper[4936]: E0930 14:19:36.221676 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418f655d-bae8-4905-8dfc-770612a750c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.221700 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="418f655d-bae8-4905-8dfc-770612a750c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.221869 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="418f655d-bae8-4905-8dfc-770612a750c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.222488 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.235298 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf"] Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.235640 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.235819 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.235936 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.236067 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.236268 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.347208 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms25l\" (UniqueName: \"kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.347258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: 
\"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.347356 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.347379 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.449922 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.449976 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.450061 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ms25l\" (UniqueName: \"kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.450123 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.456790 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.457123 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.457744 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: 
\"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.468977 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms25l\" (UniqueName: \"kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p46lf\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:36 crc kubenswrapper[4936]: I0930 14:19:36.549513 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:19:37 crc kubenswrapper[4936]: I0930 14:19:37.067136 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf"] Sep 30 14:19:37 crc kubenswrapper[4936]: I0930 14:19:37.074102 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:19:37 crc kubenswrapper[4936]: I0930 14:19:37.125880 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" event={"ID":"9cab680a-f90c-4086-96f4-66c47ec4e497","Type":"ContainerStarted","Data":"7a59f19b635fd551c933ee864de907d67f89a73b83fe8d314728909de52ce97f"} Sep 30 14:19:38 crc kubenswrapper[4936]: I0930 14:19:38.134328 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" event={"ID":"9cab680a-f90c-4086-96f4-66c47ec4e497","Type":"ContainerStarted","Data":"980bac731324b8539154aa4659d66ef66e9a3ba9f55d7e3a584be536378695a7"} Sep 30 14:19:38 crc kubenswrapper[4936]: I0930 14:19:38.159560 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" podStartSLOduration=1.7096489259999998 podStartE2EDuration="2.159543599s" podCreationTimestamp="2025-09-30 14:19:36 +0000 UTC" firstStartedPulling="2025-09-30 14:19:37.073667233 +0000 UTC m=+2427.457669534" lastFinishedPulling="2025-09-30 14:19:37.523561906 +0000 UTC m=+2427.907564207" observedRunningTime="2025-09-30 14:19:38.157294037 +0000 UTC m=+2428.541296348" watchObservedRunningTime="2025-09-30 14:19:38.159543599 +0000 UTC m=+2428.543545900" Sep 30 14:19:46 crc kubenswrapper[4936]: I0930 14:19:46.315846 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:19:46 crc kubenswrapper[4936]: E0930 14:19:46.316622 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:00 crc kubenswrapper[4936]: I0930 14:20:00.321556 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:20:00 crc kubenswrapper[4936]: E0930 14:20:00.322435 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:05 crc kubenswrapper[4936]: I0930 14:20:05.359415 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="9cab680a-f90c-4086-96f4-66c47ec4e497" containerID="980bac731324b8539154aa4659d66ef66e9a3ba9f55d7e3a584be536378695a7" exitCode=0 Sep 30 14:20:05 crc kubenswrapper[4936]: I0930 14:20:05.359516 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" event={"ID":"9cab680a-f90c-4086-96f4-66c47ec4e497","Type":"ContainerDied","Data":"980bac731324b8539154aa4659d66ef66e9a3ba9f55d7e3a584be536378695a7"} Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.765946 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.920127 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key\") pod \"9cab680a-f90c-4086-96f4-66c47ec4e497\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.920298 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory\") pod \"9cab680a-f90c-4086-96f4-66c47ec4e497\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.920429 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms25l\" (UniqueName: \"kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l\") pod \"9cab680a-f90c-4086-96f4-66c47ec4e497\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.920492 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph\") pod 
\"9cab680a-f90c-4086-96f4-66c47ec4e497\" (UID: \"9cab680a-f90c-4086-96f4-66c47ec4e497\") " Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.929543 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph" (OuterVolumeSpecName: "ceph") pod "9cab680a-f90c-4086-96f4-66c47ec4e497" (UID: "9cab680a-f90c-4086-96f4-66c47ec4e497"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.930560 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l" (OuterVolumeSpecName: "kube-api-access-ms25l") pod "9cab680a-f90c-4086-96f4-66c47ec4e497" (UID: "9cab680a-f90c-4086-96f4-66c47ec4e497"). InnerVolumeSpecName "kube-api-access-ms25l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.949008 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9cab680a-f90c-4086-96f4-66c47ec4e497" (UID: "9cab680a-f90c-4086-96f4-66c47ec4e497"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4936]: I0930 14:20:06.949864 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory" (OuterVolumeSpecName: "inventory") pod "9cab680a-f90c-4086-96f4-66c47ec4e497" (UID: "9cab680a-f90c-4086-96f4-66c47ec4e497"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.022510 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.022543 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms25l\" (UniqueName: \"kubernetes.io/projected/9cab680a-f90c-4086-96f4-66c47ec4e497-kube-api-access-ms25l\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.022556 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.022575 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9cab680a-f90c-4086-96f4-66c47ec4e497-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.375086 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" event={"ID":"9cab680a-f90c-4086-96f4-66c47ec4e497","Type":"ContainerDied","Data":"7a59f19b635fd551c933ee864de907d67f89a73b83fe8d314728909de52ce97f"} Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.375126 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a59f19b635fd551c933ee864de907d67f89a73b83fe8d314728909de52ce97f" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.375192 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p46lf" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.461176 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc"] Sep 30 14:20:07 crc kubenswrapper[4936]: E0930 14:20:07.461832 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cab680a-f90c-4086-96f4-66c47ec4e497" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.461851 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cab680a-f90c-4086-96f4-66c47ec4e497" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.462014 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cab680a-f90c-4086-96f4-66c47ec4e497" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.462617 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.465112 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.465152 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.465409 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.467477 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.475108 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc"] Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.475360 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.633377 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.633437 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.633456 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.633479 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjcb\" (UniqueName: \"kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.735605 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.735660 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.735686 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fsjcb\" (UniqueName: \"kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.735822 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.741922 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.741936 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.742271 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.750838 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjcb\" (UniqueName: \"kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zskgc\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:07 crc kubenswrapper[4936]: I0930 14:20:07.795798 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:08 crc kubenswrapper[4936]: I0930 14:20:08.298964 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc"] Sep 30 14:20:08 crc kubenswrapper[4936]: I0930 14:20:08.385048 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" event={"ID":"d5817c4b-0566-4854-84c9-ad9a69b78172","Type":"ContainerStarted","Data":"ba99ea11e1eed0e048cff690df7f316ce108fd161be4ed7e962bdc54899eff07"} Sep 30 14:20:09 crc kubenswrapper[4936]: I0930 14:20:09.401796 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" event={"ID":"d5817c4b-0566-4854-84c9-ad9a69b78172","Type":"ContainerStarted","Data":"30dfe5e226dabfdf33bde195bea0527f41de7885b16699ccae7bc9759e47199a"} Sep 30 14:20:09 crc kubenswrapper[4936]: I0930 14:20:09.425731 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" podStartSLOduration=1.899299638 podStartE2EDuration="2.425713463s" podCreationTimestamp="2025-09-30 14:20:07 +0000 UTC" firstStartedPulling="2025-09-30 
14:20:08.301816472 +0000 UTC m=+2458.685818773" lastFinishedPulling="2025-09-30 14:20:08.828230297 +0000 UTC m=+2459.212232598" observedRunningTime="2025-09-30 14:20:09.421843667 +0000 UTC m=+2459.805845968" watchObservedRunningTime="2025-09-30 14:20:09.425713463 +0000 UTC m=+2459.809715754" Sep 30 14:20:14 crc kubenswrapper[4936]: I0930 14:20:14.315562 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:20:14 crc kubenswrapper[4936]: E0930 14:20:14.316120 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:14 crc kubenswrapper[4936]: I0930 14:20:14.456686 4936 generic.go:334] "Generic (PLEG): container finished" podID="d5817c4b-0566-4854-84c9-ad9a69b78172" containerID="30dfe5e226dabfdf33bde195bea0527f41de7885b16699ccae7bc9759e47199a" exitCode=0 Sep 30 14:20:14 crc kubenswrapper[4936]: I0930 14:20:14.456730 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" event={"ID":"d5817c4b-0566-4854-84c9-ad9a69b78172","Type":"ContainerDied","Data":"30dfe5e226dabfdf33bde195bea0527f41de7885b16699ccae7bc9759e47199a"} Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.857920 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.987739 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph\") pod \"d5817c4b-0566-4854-84c9-ad9a69b78172\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.987799 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsjcb\" (UniqueName: \"kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb\") pod \"d5817c4b-0566-4854-84c9-ad9a69b78172\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.987823 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory\") pod \"d5817c4b-0566-4854-84c9-ad9a69b78172\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.987938 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key\") pod \"d5817c4b-0566-4854-84c9-ad9a69b78172\" (UID: \"d5817c4b-0566-4854-84c9-ad9a69b78172\") " Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.993850 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph" (OuterVolumeSpecName: "ceph") pod "d5817c4b-0566-4854-84c9-ad9a69b78172" (UID: "d5817c4b-0566-4854-84c9-ad9a69b78172"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:15 crc kubenswrapper[4936]: I0930 14:20:15.993952 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb" (OuterVolumeSpecName: "kube-api-access-fsjcb") pod "d5817c4b-0566-4854-84c9-ad9a69b78172" (UID: "d5817c4b-0566-4854-84c9-ad9a69b78172"). InnerVolumeSpecName "kube-api-access-fsjcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.014105 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5817c4b-0566-4854-84c9-ad9a69b78172" (UID: "d5817c4b-0566-4854-84c9-ad9a69b78172"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.019671 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory" (OuterVolumeSpecName: "inventory") pod "d5817c4b-0566-4854-84c9-ad9a69b78172" (UID: "d5817c4b-0566-4854-84c9-ad9a69b78172"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.090885 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.091158 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsjcb\" (UniqueName: \"kubernetes.io/projected/d5817c4b-0566-4854-84c9-ad9a69b78172-kube-api-access-fsjcb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.091223 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.091292 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5817c4b-0566-4854-84c9-ad9a69b78172-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.499728 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" event={"ID":"d5817c4b-0566-4854-84c9-ad9a69b78172","Type":"ContainerDied","Data":"ba99ea11e1eed0e048cff690df7f316ce108fd161be4ed7e962bdc54899eff07"} Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.499768 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba99ea11e1eed0e048cff690df7f316ce108fd161be4ed7e962bdc54899eff07" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.500156 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zskgc" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.553768 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl"] Sep 30 14:20:16 crc kubenswrapper[4936]: E0930 14:20:16.554464 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5817c4b-0566-4854-84c9-ad9a69b78172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.554579 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5817c4b-0566-4854-84c9-ad9a69b78172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.554896 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5817c4b-0566-4854-84c9-ad9a69b78172" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.555753 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.558118 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.558352 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.558548 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.558840 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.558964 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.562326 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl"] Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.703804 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.704065 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.704143 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.704169 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lwc\" (UniqueName: \"kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.805648 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.805887 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.806006 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lwc\" (UniqueName: 
\"kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.806192 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.811040 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.811131 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.812492 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.823287 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lwc\" (UniqueName: \"kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6vjtl\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:16 crc kubenswrapper[4936]: I0930 14:20:16.886446 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:17 crc kubenswrapper[4936]: I0930 14:20:17.362930 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl"] Sep 30 14:20:17 crc kubenswrapper[4936]: I0930 14:20:17.507864 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" event={"ID":"73455681-e4b9-4313-991f-a00d4fab6d26","Type":"ContainerStarted","Data":"e0a880b293d5e5a46378c9ae2d28291cbac02ccede7f69aa010b9c8b18c60749"} Sep 30 14:20:18 crc kubenswrapper[4936]: I0930 14:20:18.516164 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" event={"ID":"73455681-e4b9-4313-991f-a00d4fab6d26","Type":"ContainerStarted","Data":"637989e65d5f18e9a8907b985a709b94f97f4201447727c0f4f564cb043ab731"} Sep 30 14:20:18 crc kubenswrapper[4936]: I0930 14:20:18.534803 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" podStartSLOduration=2.007526347 podStartE2EDuration="2.534783735s" podCreationTimestamp="2025-09-30 14:20:16 +0000 UTC" firstStartedPulling="2025-09-30 14:20:17.375881643 +0000 UTC m=+2467.759883944" lastFinishedPulling="2025-09-30 14:20:17.903139031 +0000 UTC m=+2468.287141332" observedRunningTime="2025-09-30 14:20:18.531592667 +0000 UTC 
m=+2468.915594968" watchObservedRunningTime="2025-09-30 14:20:18.534783735 +0000 UTC m=+2468.918786036" Sep 30 14:20:27 crc kubenswrapper[4936]: I0930 14:20:27.315495 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:20:27 crc kubenswrapper[4936]: E0930 14:20:27.316314 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:39 crc kubenswrapper[4936]: I0930 14:20:39.315531 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:20:39 crc kubenswrapper[4936]: E0930 14:20:39.316361 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:51 crc kubenswrapper[4936]: I0930 14:20:51.315462 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:20:51 crc kubenswrapper[4936]: E0930 14:20:51.316634 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:20:56 crc kubenswrapper[4936]: I0930 14:20:56.796374 4936 generic.go:334] "Generic (PLEG): container finished" podID="73455681-e4b9-4313-991f-a00d4fab6d26" containerID="637989e65d5f18e9a8907b985a709b94f97f4201447727c0f4f564cb043ab731" exitCode=0 Sep 30 14:20:56 crc kubenswrapper[4936]: I0930 14:20:56.796468 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" event={"ID":"73455681-e4b9-4313-991f-a00d4fab6d26","Type":"ContainerDied","Data":"637989e65d5f18e9a8907b985a709b94f97f4201447727c0f4f564cb043ab731"} Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.225039 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.334174 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory\") pod \"73455681-e4b9-4313-991f-a00d4fab6d26\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.334452 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9lwc\" (UniqueName: \"kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc\") pod \"73455681-e4b9-4313-991f-a00d4fab6d26\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.334503 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key\") pod 
\"73455681-e4b9-4313-991f-a00d4fab6d26\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.334585 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph\") pod \"73455681-e4b9-4313-991f-a00d4fab6d26\" (UID: \"73455681-e4b9-4313-991f-a00d4fab6d26\") " Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.345143 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph" (OuterVolumeSpecName: "ceph") pod "73455681-e4b9-4313-991f-a00d4fab6d26" (UID: "73455681-e4b9-4313-991f-a00d4fab6d26"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.345268 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc" (OuterVolumeSpecName: "kube-api-access-f9lwc") pod "73455681-e4b9-4313-991f-a00d4fab6d26" (UID: "73455681-e4b9-4313-991f-a00d4fab6d26"). InnerVolumeSpecName "kube-api-access-f9lwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.363437 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory" (OuterVolumeSpecName: "inventory") pod "73455681-e4b9-4313-991f-a00d4fab6d26" (UID: "73455681-e4b9-4313-991f-a00d4fab6d26"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.379281 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73455681-e4b9-4313-991f-a00d4fab6d26" (UID: "73455681-e4b9-4313-991f-a00d4fab6d26"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.440814 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9lwc\" (UniqueName: \"kubernetes.io/projected/73455681-e4b9-4313-991f-a00d4fab6d26-kube-api-access-f9lwc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.440867 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.440966 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.440979 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73455681-e4b9-4313-991f-a00d4fab6d26-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.815675 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" event={"ID":"73455681-e4b9-4313-991f-a00d4fab6d26","Type":"ContainerDied","Data":"e0a880b293d5e5a46378c9ae2d28291cbac02ccede7f69aa010b9c8b18c60749"} Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.815718 4936 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e0a880b293d5e5a46378c9ae2d28291cbac02ccede7f69aa010b9c8b18c60749" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.815750 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6vjtl" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.950863 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q"] Sep 30 14:20:58 crc kubenswrapper[4936]: E0930 14:20:58.951282 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73455681-e4b9-4313-991f-a00d4fab6d26" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.951302 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="73455681-e4b9-4313-991f-a00d4fab6d26" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.951556 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="73455681-e4b9-4313-991f-a00d4fab6d26" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.952141 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.955787 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.956859 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.956967 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.957511 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.957598 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:20:58 crc kubenswrapper[4936]: I0930 14:20:58.977484 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q"] Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.051829 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smcs2\" (UniqueName: \"kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.051891 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: 
\"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.052010 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.052053 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.153079 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smcs2\" (UniqueName: \"kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.153393 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.153433 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.153453 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.157055 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.158816 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.158965 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.172600 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smcs2\" (UniqueName: \"kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.268924 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.796936 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q"] Sep 30 14:20:59 crc kubenswrapper[4936]: I0930 14:20:59.826147 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" event={"ID":"11eb924e-ede4-4f91-a053-946c9951cf0e","Type":"ContainerStarted","Data":"31bbbd6e6b914aa7da577c2b7e9aeb46480a94eabea02374afca826eb6d5e778"} Sep 30 14:21:00 crc kubenswrapper[4936]: I0930 14:21:00.835312 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" event={"ID":"11eb924e-ede4-4f91-a053-946c9951cf0e","Type":"ContainerStarted","Data":"58554db043bb115a0f40e1cdf9d49e7a2ce3d9589c39fdd12b8ae640e0753dbb"} Sep 30 14:21:03 crc kubenswrapper[4936]: I0930 14:21:03.315204 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:21:03 crc kubenswrapper[4936]: E0930 14:21:03.315753 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:21:04 crc kubenswrapper[4936]: I0930 14:21:04.866952 4936 generic.go:334] "Generic (PLEG): container finished" podID="11eb924e-ede4-4f91-a053-946c9951cf0e" containerID="58554db043bb115a0f40e1cdf9d49e7a2ce3d9589c39fdd12b8ae640e0753dbb" exitCode=0 Sep 30 14:21:04 crc kubenswrapper[4936]: I0930 14:21:04.867042 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" event={"ID":"11eb924e-ede4-4f91-a053-946c9951cf0e","Type":"ContainerDied","Data":"58554db043bb115a0f40e1cdf9d49e7a2ce3d9589c39fdd12b8ae640e0753dbb"} Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.253632 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.394132 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key\") pod \"11eb924e-ede4-4f91-a053-946c9951cf0e\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.394237 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory\") pod \"11eb924e-ede4-4f91-a053-946c9951cf0e\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.394289 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smcs2\" (UniqueName: 
\"kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2\") pod \"11eb924e-ede4-4f91-a053-946c9951cf0e\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.394317 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph\") pod \"11eb924e-ede4-4f91-a053-946c9951cf0e\" (UID: \"11eb924e-ede4-4f91-a053-946c9951cf0e\") " Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.399850 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph" (OuterVolumeSpecName: "ceph") pod "11eb924e-ede4-4f91-a053-946c9951cf0e" (UID: "11eb924e-ede4-4f91-a053-946c9951cf0e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.400962 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2" (OuterVolumeSpecName: "kube-api-access-smcs2") pod "11eb924e-ede4-4f91-a053-946c9951cf0e" (UID: "11eb924e-ede4-4f91-a053-946c9951cf0e"). InnerVolumeSpecName "kube-api-access-smcs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.422913 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11eb924e-ede4-4f91-a053-946c9951cf0e" (UID: "11eb924e-ede4-4f91-a053-946c9951cf0e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.428135 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory" (OuterVolumeSpecName: "inventory") pod "11eb924e-ede4-4f91-a053-946c9951cf0e" (UID: "11eb924e-ede4-4f91-a053-946c9951cf0e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.496907 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smcs2\" (UniqueName: \"kubernetes.io/projected/11eb924e-ede4-4f91-a053-946c9951cf0e-kube-api-access-smcs2\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.497155 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.497165 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.497175 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb924e-ede4-4f91-a053-946c9951cf0e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.882180 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" event={"ID":"11eb924e-ede4-4f91-a053-946c9951cf0e","Type":"ContainerDied","Data":"31bbbd6e6b914aa7da577c2b7e9aeb46480a94eabea02374afca826eb6d5e778"} Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.882234 4936 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="31bbbd6e6b914aa7da577c2b7e9aeb46480a94eabea02374afca826eb6d5e778" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.882241 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.981583 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2"] Sep 30 14:21:06 crc kubenswrapper[4936]: E0930 14:21:06.982316 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11eb924e-ede4-4f91-a053-946c9951cf0e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.982367 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="11eb924e-ede4-4f91-a053-946c9951cf0e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.982602 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="11eb924e-ede4-4f91-a053-946c9951cf0e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.983328 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.985502 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.985910 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.986286 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.986540 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.986682 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:21:06 crc kubenswrapper[4936]: I0930 14:21:06.997312 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2"] Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.106495 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbb2h\" (UniqueName: \"kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.106549 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: 
\"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.106662 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.106740 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.208022 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.208102 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.208297 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bbb2h\" (UniqueName: \"kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.208369 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.212742 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.214649 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.215602 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 
crc kubenswrapper[4936]: I0930 14:21:07.226377 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbb2h\" (UniqueName: \"kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.316813 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.832890 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2"] Sep 30 14:21:07 crc kubenswrapper[4936]: I0930 14:21:07.890741 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" event={"ID":"ba550a84-9368-4cdb-8e5a-d474797cdd33","Type":"ContainerStarted","Data":"a0ce75cbd4299ab1de2eb708b10206f4b6f45ef0e809be444e5e255ecef15a8a"} Sep 30 14:21:08 crc kubenswrapper[4936]: I0930 14:21:08.900253 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" event={"ID":"ba550a84-9368-4cdb-8e5a-d474797cdd33","Type":"ContainerStarted","Data":"e8ee7b08d086cfa3bb60b8de29d6e8c309eadfd89a3fa49cc6234189ffe5713b"} Sep 30 14:21:08 crc kubenswrapper[4936]: I0930 14:21:08.918868 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" podStartSLOduration=2.520496827 podStartE2EDuration="2.918847735s" podCreationTimestamp="2025-09-30 14:21:06 +0000 UTC" firstStartedPulling="2025-09-30 14:21:07.843768175 +0000 UTC m=+2518.227770476" lastFinishedPulling="2025-09-30 14:21:08.242119083 +0000 UTC m=+2518.626121384" 
observedRunningTime="2025-09-30 14:21:08.915912355 +0000 UTC m=+2519.299914676" watchObservedRunningTime="2025-09-30 14:21:08.918847735 +0000 UTC m=+2519.302850046" Sep 30 14:21:18 crc kubenswrapper[4936]: I0930 14:21:18.316158 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:21:18 crc kubenswrapper[4936]: I0930 14:21:18.983103 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3"} Sep 30 14:21:52 crc kubenswrapper[4936]: I0930 14:21:52.248842 4936 generic.go:334] "Generic (PLEG): container finished" podID="ba550a84-9368-4cdb-8e5a-d474797cdd33" containerID="e8ee7b08d086cfa3bb60b8de29d6e8c309eadfd89a3fa49cc6234189ffe5713b" exitCode=0 Sep 30 14:21:52 crc kubenswrapper[4936]: I0930 14:21:52.248916 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" event={"ID":"ba550a84-9368-4cdb-8e5a-d474797cdd33","Type":"ContainerDied","Data":"e8ee7b08d086cfa3bb60b8de29d6e8c309eadfd89a3fa49cc6234189ffe5713b"} Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.709290 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.778832 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbb2h\" (UniqueName: \"kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h\") pod \"ba550a84-9368-4cdb-8e5a-d474797cdd33\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.779642 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key\") pod \"ba550a84-9368-4cdb-8e5a-d474797cdd33\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.779669 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph\") pod \"ba550a84-9368-4cdb-8e5a-d474797cdd33\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.779770 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory\") pod \"ba550a84-9368-4cdb-8e5a-d474797cdd33\" (UID: \"ba550a84-9368-4cdb-8e5a-d474797cdd33\") " Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.785514 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h" (OuterVolumeSpecName: "kube-api-access-bbb2h") pod "ba550a84-9368-4cdb-8e5a-d474797cdd33" (UID: "ba550a84-9368-4cdb-8e5a-d474797cdd33"). InnerVolumeSpecName "kube-api-access-bbb2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.794615 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph" (OuterVolumeSpecName: "ceph") pod "ba550a84-9368-4cdb-8e5a-d474797cdd33" (UID: "ba550a84-9368-4cdb-8e5a-d474797cdd33"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.807617 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ba550a84-9368-4cdb-8e5a-d474797cdd33" (UID: "ba550a84-9368-4cdb-8e5a-d474797cdd33"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.811759 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory" (OuterVolumeSpecName: "inventory") pod "ba550a84-9368-4cdb-8e5a-d474797cdd33" (UID: "ba550a84-9368-4cdb-8e5a-d474797cdd33"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.882552 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbb2h\" (UniqueName: \"kubernetes.io/projected/ba550a84-9368-4cdb-8e5a-d474797cdd33-kube-api-access-bbb2h\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.882625 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.882639 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:53 crc kubenswrapper[4936]: I0930 14:21:53.882651 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba550a84-9368-4cdb-8e5a-d474797cdd33-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.265776 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" event={"ID":"ba550a84-9368-4cdb-8e5a-d474797cdd33","Type":"ContainerDied","Data":"a0ce75cbd4299ab1de2eb708b10206f4b6f45ef0e809be444e5e255ecef15a8a"} Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.265817 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ce75cbd4299ab1de2eb708b10206f4b6f45ef0e809be444e5e255ecef15a8a" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.265826 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.361521 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cnrw6"] Sep 30 14:21:54 crc kubenswrapper[4936]: E0930 14:21:54.362177 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba550a84-9368-4cdb-8e5a-d474797cdd33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.362258 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba550a84-9368-4cdb-8e5a-d474797cdd33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.362511 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba550a84-9368-4cdb-8e5a-d474797cdd33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.363186 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.365687 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.365803 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.366047 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.366477 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.368974 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.381929 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cnrw6"] Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.493378 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.493487 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 
crc kubenswrapper[4936]: I0930 14:21:54.493625 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppc69\" (UniqueName: \"kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.493653 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.594957 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.595132 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppc69\" (UniqueName: \"kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.595170 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.595211 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.599406 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.603980 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.612349 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.613091 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppc69\" (UniqueName: \"kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69\") pod \"ssh-known-hosts-edpm-deployment-cnrw6\" (UID: 
\"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:54 crc kubenswrapper[4936]: I0930 14:21:54.681989 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:21:55 crc kubenswrapper[4936]: I0930 14:21:55.201795 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cnrw6"] Sep 30 14:21:55 crc kubenswrapper[4936]: I0930 14:21:55.275746 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" event={"ID":"05c30103-ea3f-41b7-82d1-73f43681e4e4","Type":"ContainerStarted","Data":"faad87ebcb5ca22f918426351510ccdae7f54a977b0a7a0d1854254c0d988803"} Sep 30 14:21:56 crc kubenswrapper[4936]: I0930 14:21:56.289293 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" event={"ID":"05c30103-ea3f-41b7-82d1-73f43681e4e4","Type":"ContainerStarted","Data":"6cbd051c5b0a6edfef9de78e25375528dfbf5439323b21d780749f55d84d9a5b"} Sep 30 14:21:56 crc kubenswrapper[4936]: I0930 14:21:56.313669 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" podStartSLOduration=1.8493982070000001 podStartE2EDuration="2.313652355s" podCreationTimestamp="2025-09-30 14:21:54 +0000 UTC" firstStartedPulling="2025-09-30 14:21:55.207726327 +0000 UTC m=+2565.591728628" lastFinishedPulling="2025-09-30 14:21:55.671980475 +0000 UTC m=+2566.055982776" observedRunningTime="2025-09-30 14:21:56.308984587 +0000 UTC m=+2566.692986888" watchObservedRunningTime="2025-09-30 14:21:56.313652355 +0000 UTC m=+2566.697654656" Sep 30 14:22:05 crc kubenswrapper[4936]: I0930 14:22:05.369648 4936 generic.go:334] "Generic (PLEG): container finished" podID="05c30103-ea3f-41b7-82d1-73f43681e4e4" containerID="6cbd051c5b0a6edfef9de78e25375528dfbf5439323b21d780749f55d84d9a5b" 
exitCode=0 Sep 30 14:22:05 crc kubenswrapper[4936]: I0930 14:22:05.369742 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" event={"ID":"05c30103-ea3f-41b7-82d1-73f43681e4e4","Type":"ContainerDied","Data":"6cbd051c5b0a6edfef9de78e25375528dfbf5439323b21d780749f55d84d9a5b"} Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.759542 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.821792 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph\") pod \"05c30103-ea3f-41b7-82d1-73f43681e4e4\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.821939 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0\") pod \"05c30103-ea3f-41b7-82d1-73f43681e4e4\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.821966 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam\") pod \"05c30103-ea3f-41b7-82d1-73f43681e4e4\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.822045 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppc69\" (UniqueName: \"kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69\") pod \"05c30103-ea3f-41b7-82d1-73f43681e4e4\" (UID: \"05c30103-ea3f-41b7-82d1-73f43681e4e4\") " Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 
14:22:06.828074 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph" (OuterVolumeSpecName: "ceph") pod "05c30103-ea3f-41b7-82d1-73f43681e4e4" (UID: "05c30103-ea3f-41b7-82d1-73f43681e4e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.830213 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69" (OuterVolumeSpecName: "kube-api-access-ppc69") pod "05c30103-ea3f-41b7-82d1-73f43681e4e4" (UID: "05c30103-ea3f-41b7-82d1-73f43681e4e4"). InnerVolumeSpecName "kube-api-access-ppc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.851431 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "05c30103-ea3f-41b7-82d1-73f43681e4e4" (UID: "05c30103-ea3f-41b7-82d1-73f43681e4e4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.853576 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05c30103-ea3f-41b7-82d1-73f43681e4e4" (UID: "05c30103-ea3f-41b7-82d1-73f43681e4e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.924318 4936 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.924374 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.924389 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppc69\" (UniqueName: \"kubernetes.io/projected/05c30103-ea3f-41b7-82d1-73f43681e4e4-kube-api-access-ppc69\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:06 crc kubenswrapper[4936]: I0930 14:22:06.924403 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/05c30103-ea3f-41b7-82d1-73f43681e4e4-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.387528 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" event={"ID":"05c30103-ea3f-41b7-82d1-73f43681e4e4","Type":"ContainerDied","Data":"faad87ebcb5ca22f918426351510ccdae7f54a977b0a7a0d1854254c0d988803"} Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.387574 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faad87ebcb5ca22f918426351510ccdae7f54a977b0a7a0d1854254c0d988803" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.387939 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cnrw6" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.473776 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct"] Sep 30 14:22:07 crc kubenswrapper[4936]: E0930 14:22:07.474260 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c30103-ea3f-41b7-82d1-73f43681e4e4" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.474288 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c30103-ea3f-41b7-82d1-73f43681e4e4" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.476984 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c30103-ea3f-41b7-82d1-73f43681e4e4" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.477844 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.481881 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.486687 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.486745 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.487090 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.489987 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct"] Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.490396 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.535508 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.535566 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmsz\" (UniqueName: \"kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.535616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.535954 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.637805 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.637917 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.637961 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txmsz\" (UniqueName: 
\"kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.638040 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.642936 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.646059 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.646921 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.665985 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-txmsz\" (UniqueName: \"kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5jwct\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:07 crc kubenswrapper[4936]: I0930 14:22:07.814398 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:08 crc kubenswrapper[4936]: I0930 14:22:08.329583 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct"] Sep 30 14:22:08 crc kubenswrapper[4936]: W0930 14:22:08.333432 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f95c4be_ca65_49a0_90f5_9b36926fe423.slice/crio-e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf WatchSource:0}: Error finding container e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf: Status 404 returned error can't find the container with id e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf Sep 30 14:22:08 crc kubenswrapper[4936]: I0930 14:22:08.397418 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" event={"ID":"3f95c4be-ca65-49a0-90f5-9b36926fe423","Type":"ContainerStarted","Data":"e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf"} Sep 30 14:22:09 crc kubenswrapper[4936]: I0930 14:22:09.416048 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" event={"ID":"3f95c4be-ca65-49a0-90f5-9b36926fe423","Type":"ContainerStarted","Data":"72a575188d9b71a365ab8773ad9c7a2aa1326f13fba1a8cdbe9a34955afe6efd"} Sep 30 14:22:17 crc kubenswrapper[4936]: I0930 
14:22:17.508277 4936 generic.go:334] "Generic (PLEG): container finished" podID="3f95c4be-ca65-49a0-90f5-9b36926fe423" containerID="72a575188d9b71a365ab8773ad9c7a2aa1326f13fba1a8cdbe9a34955afe6efd" exitCode=0 Sep 30 14:22:17 crc kubenswrapper[4936]: I0930 14:22:17.508809 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" event={"ID":"3f95c4be-ca65-49a0-90f5-9b36926fe423","Type":"ContainerDied","Data":"72a575188d9b71a365ab8773ad9c7a2aa1326f13fba1a8cdbe9a34955afe6efd"} Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.908753 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.955744 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory\") pod \"3f95c4be-ca65-49a0-90f5-9b36926fe423\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.956226 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txmsz\" (UniqueName: \"kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz\") pod \"3f95c4be-ca65-49a0-90f5-9b36926fe423\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.956262 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph\") pod \"3f95c4be-ca65-49a0-90f5-9b36926fe423\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.956316 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key\") pod \"3f95c4be-ca65-49a0-90f5-9b36926fe423\" (UID: \"3f95c4be-ca65-49a0-90f5-9b36926fe423\") " Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.962778 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph" (OuterVolumeSpecName: "ceph") pod "3f95c4be-ca65-49a0-90f5-9b36926fe423" (UID: "3f95c4be-ca65-49a0-90f5-9b36926fe423"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.966960 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz" (OuterVolumeSpecName: "kube-api-access-txmsz") pod "3f95c4be-ca65-49a0-90f5-9b36926fe423" (UID: "3f95c4be-ca65-49a0-90f5-9b36926fe423"). InnerVolumeSpecName "kube-api-access-txmsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.986274 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f95c4be-ca65-49a0-90f5-9b36926fe423" (UID: "3f95c4be-ca65-49a0-90f5-9b36926fe423"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:18 crc kubenswrapper[4936]: I0930 14:22:18.987382 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory" (OuterVolumeSpecName: "inventory") pod "3f95c4be-ca65-49a0-90f5-9b36926fe423" (UID: "3f95c4be-ca65-49a0-90f5-9b36926fe423"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.059066 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.059104 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txmsz\" (UniqueName: \"kubernetes.io/projected/3f95c4be-ca65-49a0-90f5-9b36926fe423-kube-api-access-txmsz\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.059115 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.059123 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f95c4be-ca65-49a0-90f5-9b36926fe423-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.525098 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" event={"ID":"3f95c4be-ca65-49a0-90f5-9b36926fe423","Type":"ContainerDied","Data":"e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf"} Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.525146 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ccdaa3d91e7326984e85af54d5d5bfcddeb38ee45bb7694a0627397885d4cf" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.525194 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5jwct" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.604242 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2"] Sep 30 14:22:19 crc kubenswrapper[4936]: E0930 14:22:19.604587 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f95c4be-ca65-49a0-90f5-9b36926fe423" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.604604 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f95c4be-ca65-49a0-90f5-9b36926fe423" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.604807 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f95c4be-ca65-49a0-90f5-9b36926fe423" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.605374 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.608199 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.608576 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.608835 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.609023 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.609254 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.620632 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2"] Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.673625 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjzz\" (UniqueName: \"kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.673689 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.673783 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.673820 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.775050 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.775429 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjzz\" (UniqueName: \"kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.775579 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.775726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.781185 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.781659 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.782070 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.796437 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mtjzz\" (UniqueName: \"kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:19 crc kubenswrapper[4936]: I0930 14:22:19.975245 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:20 crc kubenswrapper[4936]: I0930 14:22:20.511813 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2"] Sep 30 14:22:20 crc kubenswrapper[4936]: I0930 14:22:20.535966 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" event={"ID":"d8b066d6-19ec-4267-8793-cfe95b74624f","Type":"ContainerStarted","Data":"5daf24fa8ede94694d5f10d0bf93a60dc8d79ca1effd7693fa2e86ebb09a1788"} Sep 30 14:22:21 crc kubenswrapper[4936]: I0930 14:22:21.547471 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" event={"ID":"d8b066d6-19ec-4267-8793-cfe95b74624f","Type":"ContainerStarted","Data":"539efbf5fd6fbba718895e98884e6c0f83a0e3e2f6d1664b9982ba512108c3fe"} Sep 30 14:22:21 crc kubenswrapper[4936]: I0930 14:22:21.567982 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" podStartSLOduration=1.8341639509999998 podStartE2EDuration="2.567961201s" podCreationTimestamp="2025-09-30 14:22:19 +0000 UTC" firstStartedPulling="2025-09-30 14:22:20.525480265 +0000 UTC m=+2590.909482566" lastFinishedPulling="2025-09-30 14:22:21.259277515 +0000 UTC m=+2591.643279816" observedRunningTime="2025-09-30 14:22:21.564513146 +0000 UTC m=+2591.948515437" 
watchObservedRunningTime="2025-09-30 14:22:21.567961201 +0000 UTC m=+2591.951963522" Sep 30 14:22:31 crc kubenswrapper[4936]: I0930 14:22:31.632937 4936 generic.go:334] "Generic (PLEG): container finished" podID="d8b066d6-19ec-4267-8793-cfe95b74624f" containerID="539efbf5fd6fbba718895e98884e6c0f83a0e3e2f6d1664b9982ba512108c3fe" exitCode=0 Sep 30 14:22:31 crc kubenswrapper[4936]: I0930 14:22:31.632969 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" event={"ID":"d8b066d6-19ec-4267-8793-cfe95b74624f","Type":"ContainerDied","Data":"539efbf5fd6fbba718895e98884e6c0f83a0e3e2f6d1664b9982ba512108c3fe"} Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.022361 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.095735 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtjzz\" (UniqueName: \"kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz\") pod \"d8b066d6-19ec-4267-8793-cfe95b74624f\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.095803 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory\") pod \"d8b066d6-19ec-4267-8793-cfe95b74624f\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.095895 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph\") pod \"d8b066d6-19ec-4267-8793-cfe95b74624f\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.096045 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key\") pod \"d8b066d6-19ec-4267-8793-cfe95b74624f\" (UID: \"d8b066d6-19ec-4267-8793-cfe95b74624f\") " Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.118498 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz" (OuterVolumeSpecName: "kube-api-access-mtjzz") pod "d8b066d6-19ec-4267-8793-cfe95b74624f" (UID: "d8b066d6-19ec-4267-8793-cfe95b74624f"). InnerVolumeSpecName "kube-api-access-mtjzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.139651 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph" (OuterVolumeSpecName: "ceph") pod "d8b066d6-19ec-4267-8793-cfe95b74624f" (UID: "d8b066d6-19ec-4267-8793-cfe95b74624f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.157421 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory" (OuterVolumeSpecName: "inventory") pod "d8b066d6-19ec-4267-8793-cfe95b74624f" (UID: "d8b066d6-19ec-4267-8793-cfe95b74624f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.172954 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8b066d6-19ec-4267-8793-cfe95b74624f" (UID: "d8b066d6-19ec-4267-8793-cfe95b74624f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.197455 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.197482 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtjzz\" (UniqueName: \"kubernetes.io/projected/d8b066d6-19ec-4267-8793-cfe95b74624f-kube-api-access-mtjzz\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.197493 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.197501 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d8b066d6-19ec-4267-8793-cfe95b74624f-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.652483 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" event={"ID":"d8b066d6-19ec-4267-8793-cfe95b74624f","Type":"ContainerDied","Data":"5daf24fa8ede94694d5f10d0bf93a60dc8d79ca1effd7693fa2e86ebb09a1788"} Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.652819 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5daf24fa8ede94694d5f10d0bf93a60dc8d79ca1effd7693fa2e86ebb09a1788" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.652752 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.750871 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw"] Sep 30 14:22:33 crc kubenswrapper[4936]: E0930 14:22:33.751555 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b066d6-19ec-4267-8793-cfe95b74624f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.751653 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b066d6-19ec-4267-8793-cfe95b74624f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.751917 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b066d6-19ec-4267-8793-cfe95b74624f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.752577 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.771770 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw"] Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.779418 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.779483 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.779924 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.779977 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.780033 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.780077 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.780118 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.780918 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808097 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808201 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808260 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808289 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808376 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: 
\"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808450 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808484 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808565 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808639 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: 
\"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808692 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808791 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.808865 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.809025 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsb4\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.909877 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.909926 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.909971 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.909998 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 
14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910019 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910053 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910076 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910120 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsb4\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910159 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910185 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910202 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910220 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.910242 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" 
(UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.917254 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.917260 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.917361 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.917527 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.918450 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.918662 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.918683 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.918717 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.919561 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.919921 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.920451 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.920630 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:33 crc kubenswrapper[4936]: I0930 14:22:33.925723 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsb4\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r27jw\" (UID: 
\"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:34 crc kubenswrapper[4936]: I0930 14:22:34.087756 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:22:34 crc kubenswrapper[4936]: I0930 14:22:34.602772 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw"] Sep 30 14:22:34 crc kubenswrapper[4936]: W0930 14:22:34.612656 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5c3ed5_0905_48db_aab6_0d2489fc7d42.slice/crio-2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2 WatchSource:0}: Error finding container 2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2: Status 404 returned error can't find the container with id 2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2 Sep 30 14:22:34 crc kubenswrapper[4936]: I0930 14:22:34.663315 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" event={"ID":"9c5c3ed5-0905-48db-aab6-0d2489fc7d42","Type":"ContainerStarted","Data":"2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2"} Sep 30 14:22:35 crc kubenswrapper[4936]: I0930 14:22:35.676153 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" event={"ID":"9c5c3ed5-0905-48db-aab6-0d2489fc7d42","Type":"ContainerStarted","Data":"cbb39e5198cc6c022f5996ecb8517ca09ffc43068925fca638fe410efabedc06"} Sep 30 14:22:35 crc kubenswrapper[4936]: I0930 14:22:35.696083 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" podStartSLOduration=2.257391163 
podStartE2EDuration="2.696067689s" podCreationTimestamp="2025-09-30 14:22:33 +0000 UTC" firstStartedPulling="2025-09-30 14:22:34.61559026 +0000 UTC m=+2604.999592561" lastFinishedPulling="2025-09-30 14:22:35.054266786 +0000 UTC m=+2605.438269087" observedRunningTime="2025-09-30 14:22:35.692918692 +0000 UTC m=+2606.076921003" watchObservedRunningTime="2025-09-30 14:22:35.696067689 +0000 UTC m=+2606.080069990" Sep 30 14:23:06 crc kubenswrapper[4936]: I0930 14:23:06.944930 4936 generic.go:334] "Generic (PLEG): container finished" podID="9c5c3ed5-0905-48db-aab6-0d2489fc7d42" containerID="cbb39e5198cc6c022f5996ecb8517ca09ffc43068925fca638fe410efabedc06" exitCode=0 Sep 30 14:23:06 crc kubenswrapper[4936]: I0930 14:23:06.945528 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" event={"ID":"9c5c3ed5-0905-48db-aab6-0d2489fc7d42","Type":"ContainerDied","Data":"cbb39e5198cc6c022f5996ecb8517ca09ffc43068925fca638fe410efabedc06"} Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.426920 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.552997 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553076 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553101 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nsb4\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553151 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553175 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553207 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553246 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553284 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553357 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553410 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553478 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553561 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.553619 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\" (UID: \"9c5c3ed5-0905-48db-aab6-0d2489fc7d42\") " Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.561097 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.561150 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4" (OuterVolumeSpecName: "kube-api-access-2nsb4") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "kube-api-access-2nsb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.562163 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.564226 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.564641 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.565009 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.565165 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.565684 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.566206 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.570659 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.571533 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph" (OuterVolumeSpecName: "ceph") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.589204 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory" (OuterVolumeSpecName: "inventory") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.592992 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c5c3ed5-0905-48db-aab6-0d2489fc7d42" (UID: "9c5c3ed5-0905-48db-aab6-0d2489fc7d42"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656064 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656118 4936 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656135 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656151 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656164 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656176 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nsb4\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-kube-api-access-2nsb4\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656192 4936 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656204 4936 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656219 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656235 4936 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656252 4936 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656269 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.656280 4936 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5c3ed5-0905-48db-aab6-0d2489fc7d42-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.963190 4936 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" event={"ID":"9c5c3ed5-0905-48db-aab6-0d2489fc7d42","Type":"ContainerDied","Data":"2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2"} Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.963235 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b5d7ee4f1e77e39037a7ee1957ea56cf07e472980b37a9bfda8fe995b4ba0a2" Sep 30 14:23:08 crc kubenswrapper[4936]: I0930 14:23:08.963258 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r27jw" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.103626 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp"] Sep 30 14:23:09 crc kubenswrapper[4936]: E0930 14:23:09.104014 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5c3ed5-0905-48db-aab6-0d2489fc7d42" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.104034 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5c3ed5-0905-48db-aab6-0d2489fc7d42" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.104474 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5c3ed5-0905-48db-aab6-0d2489fc7d42" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.105106 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.112846 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.113280 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.113442 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.120089 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.123624 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp"] Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.126440 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.269287 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.269695 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.269752 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.269788 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp568\" (UniqueName: \"kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.372988 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.373652 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp568\" (UniqueName: \"kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.373994 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.374265 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.380428 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.380494 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.387940 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 
14:23:09.395467 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp568\" (UniqueName: \"kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.426818 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:09 crc kubenswrapper[4936]: I0930 14:23:09.978553 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp"] Sep 30 14:23:10 crc kubenswrapper[4936]: I0930 14:23:10.425320 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:23:10 crc kubenswrapper[4936]: I0930 14:23:10.982473 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" event={"ID":"12c8fabd-e2f8-4073-af4a-21fde9a45d85","Type":"ContainerStarted","Data":"e4bec1059d03448b2b134056213b9efe703503e5a0c141b82144cb81b958676d"} Sep 30 14:23:10 crc kubenswrapper[4936]: I0930 14:23:10.982863 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" event={"ID":"12c8fabd-e2f8-4073-af4a-21fde9a45d85","Type":"ContainerStarted","Data":"4d59b488a37730ac74284d1cde7f37e4e00931b4d14f1681f6161f2f73150be1"} Sep 30 14:23:11 crc kubenswrapper[4936]: I0930 14:23:11.005865 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" podStartSLOduration=1.570715445 podStartE2EDuration="2.005845053s" podCreationTimestamp="2025-09-30 14:23:09 +0000 UTC" 
firstStartedPulling="2025-09-30 14:23:09.987319325 +0000 UTC m=+2640.371321626" lastFinishedPulling="2025-09-30 14:23:10.422448933 +0000 UTC m=+2640.806451234" observedRunningTime="2025-09-30 14:23:11.001068182 +0000 UTC m=+2641.385070493" watchObservedRunningTime="2025-09-30 14:23:11.005845053 +0000 UTC m=+2641.389847354" Sep 30 14:23:16 crc kubenswrapper[4936]: I0930 14:23:16.021841 4936 generic.go:334] "Generic (PLEG): container finished" podID="12c8fabd-e2f8-4073-af4a-21fde9a45d85" containerID="e4bec1059d03448b2b134056213b9efe703503e5a0c141b82144cb81b958676d" exitCode=0 Sep 30 14:23:16 crc kubenswrapper[4936]: I0930 14:23:16.021934 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" event={"ID":"12c8fabd-e2f8-4073-af4a-21fde9a45d85","Type":"ContainerDied","Data":"e4bec1059d03448b2b134056213b9efe703503e5a0c141b82144cb81b958676d"} Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.399567 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.402218 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.410421 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.449027 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.534682 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.534739 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.534969 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnbk\" (UniqueName: \"kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.636626 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key\") pod \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.636783 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph\") pod \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\" (UID: 
\"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.636905 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp568\" (UniqueName: \"kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568\") pod \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.636964 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory\") pod \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\" (UID: \"12c8fabd-e2f8-4073-af4a-21fde9a45d85\") " Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.637162 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.637187 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.637245 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnbk\" (UniqueName: \"kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 
14:23:17.637774 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.637836 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.641985 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568" (OuterVolumeSpecName: "kube-api-access-bp568") pod "12c8fabd-e2f8-4073-af4a-21fde9a45d85" (UID: "12c8fabd-e2f8-4073-af4a-21fde9a45d85"). InnerVolumeSpecName "kube-api-access-bp568". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.646293 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph" (OuterVolumeSpecName: "ceph") pod "12c8fabd-e2f8-4073-af4a-21fde9a45d85" (UID: "12c8fabd-e2f8-4073-af4a-21fde9a45d85"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.661827 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnbk\" (UniqueName: \"kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk\") pod \"community-operators-54wb5\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.667435 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12c8fabd-e2f8-4073-af4a-21fde9a45d85" (UID: "12c8fabd-e2f8-4073-af4a-21fde9a45d85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.685914 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory" (OuterVolumeSpecName: "inventory") pod "12c8fabd-e2f8-4073-af4a-21fde9a45d85" (UID: "12c8fabd-e2f8-4073-af4a-21fde9a45d85"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.738890 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.738930 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp568\" (UniqueName: \"kubernetes.io/projected/12c8fabd-e2f8-4073-af4a-21fde9a45d85-kube-api-access-bp568\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.738940 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.738949 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12c8fabd-e2f8-4073-af4a-21fde9a45d85-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:17 crc kubenswrapper[4936]: I0930 14:23:17.761760 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.045450 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" event={"ID":"12c8fabd-e2f8-4073-af4a-21fde9a45d85","Type":"ContainerDied","Data":"4d59b488a37730ac74284d1cde7f37e4e00931b4d14f1681f6161f2f73150be1"} Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.045496 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d59b488a37730ac74284d1cde7f37e4e00931b4d14f1681f6161f2f73150be1" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.045553 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.146412 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk"] Sep 30 14:23:18 crc kubenswrapper[4936]: E0930 14:23:18.146834 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c8fabd-e2f8-4073-af4a-21fde9a45d85" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.146852 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c8fabd-e2f8-4073-af4a-21fde9a45d85" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.147043 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c8fabd-e2f8-4073-af4a-21fde9a45d85" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.147666 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156188 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156232 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156357 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156534 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156549 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.156727 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.163923 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk"] Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.201162 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.247999 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.248101 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.248140 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znqkb\" (UniqueName: \"kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.248161 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.248281 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.248310 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.249965 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.250025 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.350223 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.351439 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.351888 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znqkb\" (UniqueName: 
\"kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.351928 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.352005 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.352057 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.352395 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.358323 
4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.359361 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.360420 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.360558 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.371292 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znqkb\" (UniqueName: \"kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sqxqk\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 
14:23:18 crc kubenswrapper[4936]: I0930 14:23:18.501465 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:23:19 crc kubenswrapper[4936]: I0930 14:23:19.055874 4936 generic.go:334] "Generic (PLEG): container finished" podID="33676b0a-1b1f-4b74-b243-33974be8d595" containerID="77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca" exitCode=0 Sep 30 14:23:19 crc kubenswrapper[4936]: I0930 14:23:19.056005 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerDied","Data":"77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca"} Sep 30 14:23:19 crc kubenswrapper[4936]: I0930 14:23:19.056183 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerStarted","Data":"bfbf6c41211ab83f98c3f35cd1efd80ba10532d01d14583ebed62ad11a5b881c"} Sep 30 14:23:19 crc kubenswrapper[4936]: I0930 14:23:19.097225 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk"] Sep 30 14:23:19 crc kubenswrapper[4936]: W0930 14:23:19.109220 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70583a2b_3a7e_48fb_a59e_32778aee08fb.slice/crio-2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53 WatchSource:0}: Error finding container 2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53: Status 404 returned error can't find the container with id 2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53 Sep 30 14:23:20 crc kubenswrapper[4936]: I0930 14:23:20.068984 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" 
event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerStarted","Data":"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc"} Sep 30 14:23:20 crc kubenswrapper[4936]: I0930 14:23:20.076303 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" event={"ID":"70583a2b-3a7e-48fb-a59e-32778aee08fb","Type":"ContainerStarted","Data":"3b2d8a01f3e14f9bbf271a589e9671d8af489c2375996a48e5eadc9a34b1ca9a"} Sep 30 14:23:20 crc kubenswrapper[4936]: I0930 14:23:20.076367 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" event={"ID":"70583a2b-3a7e-48fb-a59e-32778aee08fb","Type":"ContainerStarted","Data":"2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53"} Sep 30 14:23:20 crc kubenswrapper[4936]: I0930 14:23:20.119995 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" podStartSLOduration=1.652080526 podStartE2EDuration="2.119977764s" podCreationTimestamp="2025-09-30 14:23:18 +0000 UTC" firstStartedPulling="2025-09-30 14:23:19.112648985 +0000 UTC m=+2649.496651286" lastFinishedPulling="2025-09-30 14:23:19.580546223 +0000 UTC m=+2649.964548524" observedRunningTime="2025-09-30 14:23:20.112966211 +0000 UTC m=+2650.496968512" watchObservedRunningTime="2025-09-30 14:23:20.119977764 +0000 UTC m=+2650.503980055" Sep 30 14:23:22 crc kubenswrapper[4936]: I0930 14:23:22.093068 4936 generic.go:334] "Generic (PLEG): container finished" podID="33676b0a-1b1f-4b74-b243-33974be8d595" containerID="a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc" exitCode=0 Sep 30 14:23:22 crc kubenswrapper[4936]: I0930 14:23:22.093150 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" 
event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerDied","Data":"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc"} Sep 30 14:23:23 crc kubenswrapper[4936]: I0930 14:23:23.117029 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerStarted","Data":"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe"} Sep 30 14:23:23 crc kubenswrapper[4936]: I0930 14:23:23.147845 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54wb5" podStartSLOduration=2.685392771 podStartE2EDuration="6.147822784s" podCreationTimestamp="2025-09-30 14:23:17 +0000 UTC" firstStartedPulling="2025-09-30 14:23:19.058374794 +0000 UTC m=+2649.442377095" lastFinishedPulling="2025-09-30 14:23:22.520804807 +0000 UTC m=+2652.904807108" observedRunningTime="2025-09-30 14:23:23.137701916 +0000 UTC m=+2653.521704237" watchObservedRunningTime="2025-09-30 14:23:23.147822784 +0000 UTC m=+2653.531825095" Sep 30 14:23:27 crc kubenswrapper[4936]: I0930 14:23:27.762420 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:27 crc kubenswrapper[4936]: I0930 14:23:27.763056 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:27 crc kubenswrapper[4936]: I0930 14:23:27.808048 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:28 crc kubenswrapper[4936]: I0930 14:23:28.197902 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:28 crc kubenswrapper[4936]: I0930 14:23:28.241177 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.171984 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54wb5" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="registry-server" containerID="cri-o://9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe" gracePeriod=2 Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.599191 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.784716 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content\") pod \"33676b0a-1b1f-4b74-b243-33974be8d595\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.784806 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities\") pod \"33676b0a-1b1f-4b74-b243-33974be8d595\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.784997 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnbk\" (UniqueName: \"kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk\") pod \"33676b0a-1b1f-4b74-b243-33974be8d595\" (UID: \"33676b0a-1b1f-4b74-b243-33974be8d595\") " Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.785806 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities" (OuterVolumeSpecName: "utilities") pod "33676b0a-1b1f-4b74-b243-33974be8d595" (UID: 
"33676b0a-1b1f-4b74-b243-33974be8d595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.793873 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk" (OuterVolumeSpecName: "kube-api-access-8tnbk") pod "33676b0a-1b1f-4b74-b243-33974be8d595" (UID: "33676b0a-1b1f-4b74-b243-33974be8d595"). InnerVolumeSpecName "kube-api-access-8tnbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.829900 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33676b0a-1b1f-4b74-b243-33974be8d595" (UID: "33676b0a-1b1f-4b74-b243-33974be8d595"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.887149 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnbk\" (UniqueName: \"kubernetes.io/projected/33676b0a-1b1f-4b74-b243-33974be8d595-kube-api-access-8tnbk\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.887181 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:30 crc kubenswrapper[4936]: I0930 14:23:30.887190 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33676b0a-1b1f-4b74-b243-33974be8d595-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.183900 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="33676b0a-1b1f-4b74-b243-33974be8d595" containerID="9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe" exitCode=0 Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.183986 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54wb5" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.184016 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerDied","Data":"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe"} Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.185199 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54wb5" event={"ID":"33676b0a-1b1f-4b74-b243-33974be8d595","Type":"ContainerDied","Data":"bfbf6c41211ab83f98c3f35cd1efd80ba10532d01d14583ebed62ad11a5b881c"} Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.185236 4936 scope.go:117] "RemoveContainer" containerID="9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.228014 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.231094 4936 scope.go:117] "RemoveContainer" containerID="a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.236643 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54wb5"] Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.253088 4936 scope.go:117] "RemoveContainer" containerID="77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.285476 4936 scope.go:117] "RemoveContainer" 
containerID="9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe" Sep 30 14:23:31 crc kubenswrapper[4936]: E0930 14:23:31.285893 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe\": container with ID starting with 9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe not found: ID does not exist" containerID="9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.285922 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe"} err="failed to get container status \"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe\": rpc error: code = NotFound desc = could not find container \"9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe\": container with ID starting with 9f2c0562846588e94a7bab1c917d4a9a94f6e4fd9d4acbfaa1000bf55c2077fe not found: ID does not exist" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.285943 4936 scope.go:117] "RemoveContainer" containerID="a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc" Sep 30 14:23:31 crc kubenswrapper[4936]: E0930 14:23:31.286212 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc\": container with ID starting with a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc not found: ID does not exist" containerID="a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.286235 4936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc"} err="failed to get container status \"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc\": rpc error: code = NotFound desc = could not find container \"a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc\": container with ID starting with a2170cf41b3e88c67a41920c95266a1add4ccc6e2cbda8a6e16e41d2cb650ebc not found: ID does not exist" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.286249 4936 scope.go:117] "RemoveContainer" containerID="77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca" Sep 30 14:23:31 crc kubenswrapper[4936]: E0930 14:23:31.286556 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca\": container with ID starting with 77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca not found: ID does not exist" containerID="77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca" Sep 30 14:23:31 crc kubenswrapper[4936]: I0930 14:23:31.286581 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca"} err="failed to get container status \"77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca\": rpc error: code = NotFound desc = could not find container \"77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca\": container with ID starting with 77abc624c61972c805bf7657d80f40f72a794e1bd94aa6064c570b306f787bca not found: ID does not exist" Sep 30 14:23:32 crc kubenswrapper[4936]: I0930 14:23:32.325533 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" path="/var/lib/kubelet/pods/33676b0a-1b1f-4b74-b243-33974be8d595/volumes" Sep 30 14:23:48 crc kubenswrapper[4936]: I0930 
14:23:48.249938 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:48 crc kubenswrapper[4936]: I0930 14:23:48.250495 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.251129 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.252525 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.252630 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.255275 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.255450 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3" gracePeriod=600 Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.578812 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3" exitCode=0 Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.578977 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3"} Sep 30 14:24:18 crc kubenswrapper[4936]: I0930 14:24:18.579240 4936 scope.go:117] "RemoveContainer" containerID="393b3434ce5e2ef00a4f88f5aee903c953f22e6c113fff2ebcc213d5f93e9e3a" Sep 30 14:24:19 crc kubenswrapper[4936]: I0930 14:24:19.593701 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d"} Sep 30 14:24:33 crc kubenswrapper[4936]: I0930 14:24:33.708871 4936 generic.go:334] "Generic (PLEG): container finished" podID="70583a2b-3a7e-48fb-a59e-32778aee08fb" containerID="3b2d8a01f3e14f9bbf271a589e9671d8af489c2375996a48e5eadc9a34b1ca9a" exitCode=0 Sep 30 14:24:33 crc kubenswrapper[4936]: I0930 14:24:33.708913 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" event={"ID":"70583a2b-3a7e-48fb-a59e-32778aee08fb","Type":"ContainerDied","Data":"3b2d8a01f3e14f9bbf271a589e9671d8af489c2375996a48e5eadc9a34b1ca9a"} Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.103731 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167351 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167405 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167581 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znqkb\" (UniqueName: \"kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167706 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167746 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.167820 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key\") pod \"70583a2b-3a7e-48fb-a59e-32778aee08fb\" (UID: \"70583a2b-3a7e-48fb-a59e-32778aee08fb\") " Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.173639 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb" (OuterVolumeSpecName: "kube-api-access-znqkb") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "kube-api-access-znqkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.173787 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.173852 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph" (OuterVolumeSpecName: "ceph") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.196965 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.198532 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory" (OuterVolumeSpecName: "inventory") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.199776 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70583a2b-3a7e-48fb-a59e-32778aee08fb" (UID: "70583a2b-3a7e-48fb-a59e-32778aee08fb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270035 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270077 4936 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270093 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znqkb\" (UniqueName: \"kubernetes.io/projected/70583a2b-3a7e-48fb-a59e-32778aee08fb-kube-api-access-znqkb\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270108 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270121 4936 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.270132 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70583a2b-3a7e-48fb-a59e-32778aee08fb-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.727317 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" event={"ID":"70583a2b-3a7e-48fb-a59e-32778aee08fb","Type":"ContainerDied","Data":"2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53"} Sep 30 14:24:35 crc 
kubenswrapper[4936]: I0930 14:24:35.727627 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cfb4ed8d3a3f591d118b083fa67e3b7ea7348928c88882f546180f6edc65c53" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.727479 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sqxqk" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.849815 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg"] Sep 30 14:24:35 crc kubenswrapper[4936]: E0930 14:24:35.850462 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="extract-content" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850475 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="extract-content" Sep 30 14:24:35 crc kubenswrapper[4936]: E0930 14:24:35.850488 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70583a2b-3a7e-48fb-a59e-32778aee08fb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850494 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="70583a2b-3a7e-48fb-a59e-32778aee08fb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:24:35 crc kubenswrapper[4936]: E0930 14:24:35.850507 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="registry-server" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850514 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="registry-server" Sep 30 14:24:35 crc kubenswrapper[4936]: E0930 14:24:35.850525 4936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="extract-utilities" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850531 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="extract-utilities" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850708 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="70583a2b-3a7e-48fb-a59e-32778aee08fb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.850720 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="33676b0a-1b1f-4b74-b243-33974be8d595" containerName="registry-server" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.851432 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.854264 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.854466 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.854594 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.854825 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.854939 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.855454 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:24:35 crc 
kubenswrapper[4936]: I0930 14:24:35.857152 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.872019 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg"] Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986741 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986792 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986841 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986886 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986909 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986969 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdjx\" (UniqueName: \"kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:35 crc kubenswrapper[4936]: I0930 14:24:35.986989 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.088790 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.088849 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.088906 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.088963 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.089001 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.089064 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdjx\" (UniqueName: \"kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.089096 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.093853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.093853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.094168 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.094411 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.095361 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.095935 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.110847 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6jdjx\" (UniqueName: \"kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.173981 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.713385 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg"] Sep 30 14:24:36 crc kubenswrapper[4936]: I0930 14:24:36.735325 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" event={"ID":"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c","Type":"ContainerStarted","Data":"68e891b094402df5d4ce98347c22b891c81bceaed1e2c05759c984042c06163d"} Sep 30 14:24:37 crc kubenswrapper[4936]: I0930 14:24:37.744225 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" event={"ID":"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c","Type":"ContainerStarted","Data":"1ec828f00def01f8320682d8867c11e115cbc6708e43c1d59575e1ce4de389e2"} Sep 30 14:24:37 crc kubenswrapper[4936]: I0930 14:24:37.763163 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" podStartSLOduration=2.25650795 podStartE2EDuration="2.763143401s" podCreationTimestamp="2025-09-30 14:24:35 +0000 UTC" firstStartedPulling="2025-09-30 14:24:36.72699879 +0000 UTC m=+2727.111001091" lastFinishedPulling="2025-09-30 14:24:37.233634241 +0000 UTC m=+2727.617636542" observedRunningTime="2025-09-30 14:24:37.758072622 +0000 UTC m=+2728.142074933" 
watchObservedRunningTime="2025-09-30 14:24:37.763143401 +0000 UTC m=+2728.147145702" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.470381 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.473631 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.481685 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.555081 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hw2\" (UniqueName: \"kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.555141 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.555244 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.657684 4936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.657861 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hw2\" (UniqueName: \"kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.657893 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.658702 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.658714 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.677696 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hw2\" 
(UniqueName: \"kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2\") pod \"certified-operators-8nqfs\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:20 crc kubenswrapper[4936]: I0930 14:25:20.815328 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:21 crc kubenswrapper[4936]: I0930 14:25:21.374408 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:22 crc kubenswrapper[4936]: I0930 14:25:22.111123 4936 generic.go:334] "Generic (PLEG): container finished" podID="899879d4-093e-42d5-adbe-d5bcce07961c" containerID="b0daf9bfbabb7b60aab668c0c6c7dc5c71579c6f5e2515317ab41256f3c2c962" exitCode=0 Sep 30 14:25:22 crc kubenswrapper[4936]: I0930 14:25:22.111186 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerDied","Data":"b0daf9bfbabb7b60aab668c0c6c7dc5c71579c6f5e2515317ab41256f3c2c962"} Sep 30 14:25:22 crc kubenswrapper[4936]: I0930 14:25:22.111216 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerStarted","Data":"5d9b08f500cb9370f1e678a158bb7262f98d64bab42caa4f4beb01ed59eff5d6"} Sep 30 14:25:22 crc kubenswrapper[4936]: I0930 14:25:22.113596 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:25:24 crc kubenswrapper[4936]: I0930 14:25:24.129027 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerStarted","Data":"35f9b88c1bb364c211748559913d58ef2601bc425f7dd08a241c176c51c36fec"} Sep 
30 14:25:25 crc kubenswrapper[4936]: I0930 14:25:25.137818 4936 generic.go:334] "Generic (PLEG): container finished" podID="899879d4-093e-42d5-adbe-d5bcce07961c" containerID="35f9b88c1bb364c211748559913d58ef2601bc425f7dd08a241c176c51c36fec" exitCode=0 Sep 30 14:25:25 crc kubenswrapper[4936]: I0930 14:25:25.137995 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerDied","Data":"35f9b88c1bb364c211748559913d58ef2601bc425f7dd08a241c176c51c36fec"} Sep 30 14:25:26 crc kubenswrapper[4936]: I0930 14:25:26.151894 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerStarted","Data":"2c69dae458154284b9deb996f5e5f39ca06880c6a672be842c8fe8435c25d9f3"} Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.618269 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nqfs" podStartSLOduration=5.095057023 podStartE2EDuration="8.618252775s" podCreationTimestamp="2025-09-30 14:25:20 +0000 UTC" firstStartedPulling="2025-09-30 14:25:22.113259447 +0000 UTC m=+2772.497261748" lastFinishedPulling="2025-09-30 14:25:25.636455209 +0000 UTC m=+2776.020457500" observedRunningTime="2025-09-30 14:25:26.178706378 +0000 UTC m=+2776.562708679" watchObservedRunningTime="2025-09-30 14:25:28.618252775 +0000 UTC m=+2779.002255076" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.630445 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.633041 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.642202 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.800864 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6pmq\" (UniqueName: \"kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.801173 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.801322 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.902755 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.903053 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f6pmq\" (UniqueName: \"kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.903121 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.903563 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.903773 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.924150 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6pmq\" (UniqueName: \"kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq\") pod \"redhat-operators-tq4p2\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:28 crc kubenswrapper[4936]: I0930 14:25:28.975012 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:29 crc kubenswrapper[4936]: I0930 14:25:29.466272 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:29 crc kubenswrapper[4936]: W0930 14:25:29.475013 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7136dfa_d7f6_4aab_916a_52055fe67f4c.slice/crio-3b0a231fd2c8095fab4a564e47a2df44df27c828c920a4add24b1692591c90cf WatchSource:0}: Error finding container 3b0a231fd2c8095fab4a564e47a2df44df27c828c920a4add24b1692591c90cf: Status 404 returned error can't find the container with id 3b0a231fd2c8095fab4a564e47a2df44df27c828c920a4add24b1692591c90cf Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.185803 4936 generic.go:334] "Generic (PLEG): container finished" podID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerID="457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f" exitCode=0 Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.185854 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerDied","Data":"457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f"} Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.186140 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerStarted","Data":"3b0a231fd2c8095fab4a564e47a2df44df27c828c920a4add24b1692591c90cf"} Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.815692 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.816087 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:30 crc kubenswrapper[4936]: I0930 14:25:30.868176 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:31 crc kubenswrapper[4936]: I0930 14:25:31.242926 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:32 crc kubenswrapper[4936]: I0930 14:25:32.206076 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerStarted","Data":"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263"} Sep 30 14:25:33 crc kubenswrapper[4936]: I0930 14:25:33.817401 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:33 crc kubenswrapper[4936]: I0930 14:25:33.817951 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nqfs" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="registry-server" containerID="cri-o://2c69dae458154284b9deb996f5e5f39ca06880c6a672be842c8fe8435c25d9f3" gracePeriod=2 Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.223947 4936 generic.go:334] "Generic (PLEG): container finished" podID="899879d4-093e-42d5-adbe-d5bcce07961c" containerID="2c69dae458154284b9deb996f5e5f39ca06880c6a672be842c8fe8435c25d9f3" exitCode=0 Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.224002 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerDied","Data":"2c69dae458154284b9deb996f5e5f39ca06880c6a672be842c8fe8435c25d9f3"} Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.224240 4936 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-8nqfs" event={"ID":"899879d4-093e-42d5-adbe-d5bcce07961c","Type":"ContainerDied","Data":"5d9b08f500cb9370f1e678a158bb7262f98d64bab42caa4f4beb01ed59eff5d6"} Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.224253 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9b08f500cb9370f1e678a158bb7262f98d64bab42caa4f4beb01ed59eff5d6" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.262143 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.414251 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content\") pod \"899879d4-093e-42d5-adbe-d5bcce07961c\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.414344 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities\") pod \"899879d4-093e-42d5-adbe-d5bcce07961c\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.414757 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6hw2\" (UniqueName: \"kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2\") pod \"899879d4-093e-42d5-adbe-d5bcce07961c\" (UID: \"899879d4-093e-42d5-adbe-d5bcce07961c\") " Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.416839 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities" (OuterVolumeSpecName: "utilities") pod "899879d4-093e-42d5-adbe-d5bcce07961c" (UID: 
"899879d4-093e-42d5-adbe-d5bcce07961c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.427633 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2" (OuterVolumeSpecName: "kube-api-access-k6hw2") pod "899879d4-093e-42d5-adbe-d5bcce07961c" (UID: "899879d4-093e-42d5-adbe-d5bcce07961c"). InnerVolumeSpecName "kube-api-access-k6hw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.472816 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "899879d4-093e-42d5-adbe-d5bcce07961c" (UID: "899879d4-093e-42d5-adbe-d5bcce07961c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.517800 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.517831 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899879d4-093e-42d5-adbe-d5bcce07961c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:34 crc kubenswrapper[4936]: I0930 14:25:34.517841 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6hw2\" (UniqueName: \"kubernetes.io/projected/899879d4-093e-42d5-adbe-d5bcce07961c-kube-api-access-k6hw2\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:35 crc kubenswrapper[4936]: I0930 14:25:35.255142 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerID="f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263" exitCode=0 Sep 30 14:25:35 crc kubenswrapper[4936]: I0930 14:25:35.255191 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerDied","Data":"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263"} Sep 30 14:25:35 crc kubenswrapper[4936]: I0930 14:25:35.255289 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nqfs" Sep 30 14:25:35 crc kubenswrapper[4936]: I0930 14:25:35.314626 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:35 crc kubenswrapper[4936]: I0930 14:25:35.321105 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nqfs"] Sep 30 14:25:36 crc kubenswrapper[4936]: I0930 14:25:36.274855 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerStarted","Data":"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2"} Sep 30 14:25:36 crc kubenswrapper[4936]: I0930 14:25:36.307425 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tq4p2" podStartSLOduration=2.545907015 podStartE2EDuration="8.307398971s" podCreationTimestamp="2025-09-30 14:25:28 +0000 UTC" firstStartedPulling="2025-09-30 14:25:30.189501689 +0000 UTC m=+2780.573503990" lastFinishedPulling="2025-09-30 14:25:35.950993645 +0000 UTC m=+2786.334995946" observedRunningTime="2025-09-30 14:25:36.29826246 +0000 UTC m=+2786.682264781" watchObservedRunningTime="2025-09-30 14:25:36.307398971 +0000 UTC m=+2786.691401272" Sep 30 14:25:36 crc kubenswrapper[4936]: I0930 
14:25:36.328240 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" path="/var/lib/kubelet/pods/899879d4-093e-42d5-adbe-d5bcce07961c/volumes" Sep 30 14:25:38 crc kubenswrapper[4936]: I0930 14:25:38.292485 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" containerID="1ec828f00def01f8320682d8867c11e115cbc6708e43c1d59575e1ce4de389e2" exitCode=0 Sep 30 14:25:38 crc kubenswrapper[4936]: I0930 14:25:38.292580 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" event={"ID":"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c","Type":"ContainerDied","Data":"1ec828f00def01f8320682d8867c11e115cbc6708e43c1d59575e1ce4de389e2"} Sep 30 14:25:38 crc kubenswrapper[4936]: I0930 14:25:38.975359 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:38 crc kubenswrapper[4936]: I0930 14:25:38.976376 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.689387 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.829917 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.830303 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.830420 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.830479 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.830545 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 
14:25:39.830590 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdjx\" (UniqueName: \"kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.830660 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory\") pod \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\" (UID: \"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c\") " Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.835979 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.837413 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph" (OuterVolumeSpecName: "ceph") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.840880 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx" (OuterVolumeSpecName: "kube-api-access-6jdjx") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "kube-api-access-6jdjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.862478 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory" (OuterVolumeSpecName: "inventory") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.865519 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.867607 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.872756 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" (UID: "bf6cd6ff-b9ca-4f66-8978-0394e03fe76c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.932961 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdjx\" (UniqueName: \"kubernetes.io/projected/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-kube-api-access-6jdjx\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933037 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933047 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933082 4936 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933130 4936 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933140 4936 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:39 crc kubenswrapper[4936]: I0930 14:25:39.933150 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf6cd6ff-b9ca-4f66-8978-0394e03fe76c-ssh-key\") on node \"crc\" 
DevicePath \"\"" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.019116 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tq4p2" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="registry-server" probeResult="failure" output=< Sep 30 14:25:40 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:25:40 crc kubenswrapper[4936]: > Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.307022 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" event={"ID":"bf6cd6ff-b9ca-4f66-8978-0394e03fe76c","Type":"ContainerDied","Data":"68e891b094402df5d4ce98347c22b891c81bceaed1e2c05759c984042c06163d"} Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.307061 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e891b094402df5d4ce98347c22b891c81bceaed1e2c05759c984042c06163d" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.307093 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406063 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4"] Sep 30 14:25:40 crc kubenswrapper[4936]: E0930 14:25:40.406427 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406444 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:40 crc kubenswrapper[4936]: E0930 14:25:40.406461 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="extract-utilities" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406468 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="extract-utilities" Sep 30 14:25:40 crc kubenswrapper[4936]: E0930 14:25:40.406481 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="extract-content" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406487 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="extract-content" Sep 30 14:25:40 crc kubenswrapper[4936]: E0930 14:25:40.406513 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="registry-server" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406519 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="registry-server" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 
14:25:40.406670 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="899879d4-093e-42d5-adbe-d5bcce07961c" containerName="registry-server" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.406680 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6cd6ff-b9ca-4f66-8978-0394e03fe76c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.407263 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.411034 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.412769 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.413589 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.413797 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.413921 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.414048 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.428661 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4"] Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.543762 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.544067 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.544209 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm5lh\" (UniqueName: \"kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.544316 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.544438 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.544515 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646108 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646167 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646232 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm5lh\" (UniqueName: \"kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646252 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646278 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.646295 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.649600 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.649627 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.652187 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.653559 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.659004 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.668748 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm5lh\" (UniqueName: \"kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:40 crc kubenswrapper[4936]: I0930 14:25:40.722988 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:25:41 crc kubenswrapper[4936]: I0930 14:25:41.271039 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4"] Sep 30 14:25:41 crc kubenswrapper[4936]: I0930 14:25:41.319799 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" event={"ID":"acb77378-b2f6-48a5-b156-0c983ebde855","Type":"ContainerStarted","Data":"6520928ca382d1fa8c7f31c7a4a56bb9ef5c1f75e1773e7d4bee5b61d704fa81"} Sep 30 14:25:42 crc kubenswrapper[4936]: I0930 14:25:42.333654 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" event={"ID":"acb77378-b2f6-48a5-b156-0c983ebde855","Type":"ContainerStarted","Data":"57afe8e397965dd447ec214cc4ccaa01a1697380664736d95a8f006eb79a2a0d"} Sep 30 14:25:42 crc kubenswrapper[4936]: I0930 14:25:42.375952 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" podStartSLOduration=1.8106630639999999 podStartE2EDuration="2.375931114s" podCreationTimestamp="2025-09-30 14:25:40 +0000 UTC" firstStartedPulling="2025-09-30 14:25:41.281652963 +0000 UTC m=+2791.665655264" lastFinishedPulling="2025-09-30 14:25:41.846921013 +0000 UTC m=+2792.230923314" observedRunningTime="2025-09-30 14:25:42.367901023 +0000 UTC m=+2792.751903324" watchObservedRunningTime="2025-09-30 14:25:42.375931114 +0000 UTC m=+2792.759933415" Sep 30 14:25:49 crc kubenswrapper[4936]: I0930 14:25:49.035751 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:49 crc kubenswrapper[4936]: I0930 14:25:49.107587 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:49 crc 
kubenswrapper[4936]: I0930 14:25:49.274496 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.393609 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tq4p2" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="registry-server" containerID="cri-o://c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2" gracePeriod=2 Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.826370 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.936745 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities\") pod \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.938380 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities" (OuterVolumeSpecName: "utilities") pod "a7136dfa-d7f6-4aab-916a-52055fe67f4c" (UID: "a7136dfa-d7f6-4aab-916a-52055fe67f4c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.938745 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6pmq\" (UniqueName: \"kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq\") pod \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.939195 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content\") pod \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\" (UID: \"a7136dfa-d7f6-4aab-916a-52055fe67f4c\") " Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.940374 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:50 crc kubenswrapper[4936]: I0930 14:25:50.945541 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq" (OuterVolumeSpecName: "kube-api-access-f6pmq") pod "a7136dfa-d7f6-4aab-916a-52055fe67f4c" (UID: "a7136dfa-d7f6-4aab-916a-52055fe67f4c"). InnerVolumeSpecName "kube-api-access-f6pmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.026938 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7136dfa-d7f6-4aab-916a-52055fe67f4c" (UID: "a7136dfa-d7f6-4aab-916a-52055fe67f4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.044964 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6pmq\" (UniqueName: \"kubernetes.io/projected/a7136dfa-d7f6-4aab-916a-52055fe67f4c-kube-api-access-f6pmq\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.044992 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7136dfa-d7f6-4aab-916a-52055fe67f4c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.405745 4936 generic.go:334] "Generic (PLEG): container finished" podID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerID="c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2" exitCode=0 Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.405833 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tq4p2" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.405816 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerDied","Data":"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2"} Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.405890 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tq4p2" event={"ID":"a7136dfa-d7f6-4aab-916a-52055fe67f4c","Type":"ContainerDied","Data":"3b0a231fd2c8095fab4a564e47a2df44df27c828c920a4add24b1692591c90cf"} Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.405942 4936 scope.go:117] "RemoveContainer" containerID="c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.442592 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.456785 4936 scope.go:117] "RemoveContainer" containerID="f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.462922 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tq4p2"] Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.492142 4936 scope.go:117] "RemoveContainer" containerID="457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.543421 4936 scope.go:117] "RemoveContainer" containerID="c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2" Sep 30 14:25:51 crc kubenswrapper[4936]: E0930 14:25:51.543882 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2\": container with ID starting with c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2 not found: ID does not exist" containerID="c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.543912 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2"} err="failed to get container status \"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2\": rpc error: code = NotFound desc = could not find container \"c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2\": container with ID starting with c99f123d692a8034d3e4c462923718603d75739d7841ee3b04001e43eda6e4a2 not found: ID does not exist" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.543933 4936 scope.go:117] "RemoveContainer" containerID="f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263" 
Sep 30 14:25:51 crc kubenswrapper[4936]: E0930 14:25:51.544143 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263\": container with ID starting with f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263 not found: ID does not exist" containerID="f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.544166 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263"} err="failed to get container status \"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263\": rpc error: code = NotFound desc = could not find container \"f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263\": container with ID starting with f8352510e5628d5a6daa41551646967d03b8a4ff775819f2d3bde955f258b263 not found: ID does not exist" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.544178 4936 scope.go:117] "RemoveContainer" containerID="457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f" Sep 30 14:25:51 crc kubenswrapper[4936]: E0930 14:25:51.544360 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f\": container with ID starting with 457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f not found: ID does not exist" containerID="457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f" Sep 30 14:25:51 crc kubenswrapper[4936]: I0930 14:25:51.544377 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f"} err="failed to get container status 
\"457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f\": rpc error: code = NotFound desc = could not find container \"457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f\": container with ID starting with 457fa289c8b80ee063574a7e97858e6f7341f9a95619ed28ce237b7e00907f3f not found: ID does not exist" Sep 30 14:25:52 crc kubenswrapper[4936]: I0930 14:25:52.324812 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" path="/var/lib/kubelet/pods/a7136dfa-d7f6-4aab-916a-52055fe67f4c/volumes" Sep 30 14:26:18 crc kubenswrapper[4936]: I0930 14:26:18.249738 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:26:18 crc kubenswrapper[4936]: I0930 14:26:18.250313 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:26:48 crc kubenswrapper[4936]: I0930 14:26:48.250245 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:26:48 crc kubenswrapper[4936]: I0930 14:26:48.250951 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:27:18 crc kubenswrapper[4936]: I0930 14:27:18.250612 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:27:18 crc kubenswrapper[4936]: I0930 14:27:18.251047 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:27:18 crc kubenswrapper[4936]: I0930 14:27:18.251090 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:27:18 crc kubenswrapper[4936]: I0930 14:27:18.251676 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:27:18 crc kubenswrapper[4936]: I0930 14:27:18.251731 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" gracePeriod=600 Sep 30 14:27:18 crc kubenswrapper[4936]: E0930 14:27:18.375619 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:27:19 crc kubenswrapper[4936]: I0930 14:27:19.130904 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" exitCode=0 Sep 30 14:27:19 crc kubenswrapper[4936]: I0930 14:27:19.130977 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d"} Sep 30 14:27:19 crc kubenswrapper[4936]: I0930 14:27:19.131587 4936 scope.go:117] "RemoveContainer" containerID="bb1898fa06f7887a4f4e96d8737c5ec1fabdd689dd373ee13565538229cbbde3" Sep 30 14:27:19 crc kubenswrapper[4936]: I0930 14:27:19.133181 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:27:19 crc kubenswrapper[4936]: E0930 14:27:19.133514 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:27:30 crc kubenswrapper[4936]: I0930 14:27:30.322015 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:27:30 
crc kubenswrapper[4936]: E0930 14:27:30.322839 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:27:42 crc kubenswrapper[4936]: I0930 14:27:42.316210 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:27:42 crc kubenswrapper[4936]: E0930 14:27:42.317518 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:27:53 crc kubenswrapper[4936]: I0930 14:27:53.316067 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:27:53 crc kubenswrapper[4936]: E0930 14:27:53.317591 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:28:04 crc kubenswrapper[4936]: I0930 14:28:04.316117 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" 
Sep 30 14:28:04 crc kubenswrapper[4936]: E0930 14:28:04.316907 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:28:18 crc kubenswrapper[4936]: I0930 14:28:18.315885 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:28:18 crc kubenswrapper[4936]: E0930 14:28:18.316677 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.320610 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:28:30 crc kubenswrapper[4936]: E0930 14:28:30.321227 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.790489 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:30 crc kubenswrapper[4936]: E0930 14:28:30.791254 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="registry-server" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.791272 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="registry-server" Sep 30 14:28:30 crc kubenswrapper[4936]: E0930 14:28:30.791286 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="extract-utilities" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.791295 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="extract-utilities" Sep 30 14:28:30 crc kubenswrapper[4936]: E0930 14:28:30.791317 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="extract-content" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.791325 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="extract-content" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.791560 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7136dfa-d7f6-4aab-916a-52055fe67f4c" containerName="registry-server" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.793079 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.854680 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.952304 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.952511 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkqs6\" (UniqueName: \"kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:30 crc kubenswrapper[4936]: I0930 14:28:30.952661 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.056256 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkqs6\" (UniqueName: \"kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.056311 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.056381 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.056772 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.056902 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.078935 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkqs6\" (UniqueName: \"kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6\") pod \"redhat-marketplace-76fdt\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.123310 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.598534 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.798195 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerStarted","Data":"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6"} Sep 30 14:28:31 crc kubenswrapper[4936]: I0930 14:28:31.798757 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerStarted","Data":"7291b88e0fa96ac0939c9675166773d993fbfc7471579c118a96bb84df7cd50d"} Sep 30 14:28:32 crc kubenswrapper[4936]: I0930 14:28:32.815564 4936 generic.go:334] "Generic (PLEG): container finished" podID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerID="cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6" exitCode=0 Sep 30 14:28:32 crc kubenswrapper[4936]: I0930 14:28:32.815987 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerDied","Data":"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6"} Sep 30 14:28:34 crc kubenswrapper[4936]: I0930 14:28:34.832480 4936 generic.go:334] "Generic (PLEG): container finished" podID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerID="3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae" exitCode=0 Sep 30 14:28:34 crc kubenswrapper[4936]: I0930 14:28:34.832566 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" 
event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerDied","Data":"3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae"} Sep 30 14:28:35 crc kubenswrapper[4936]: I0930 14:28:35.844441 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerStarted","Data":"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412"} Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.124284 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.124926 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.176765 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.201098 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76fdt" podStartSLOduration=8.700135939 podStartE2EDuration="11.201080736s" podCreationTimestamp="2025-09-30 14:28:30 +0000 UTC" firstStartedPulling="2025-09-30 14:28:32.817366363 +0000 UTC m=+2963.201368664" lastFinishedPulling="2025-09-30 14:28:35.31831116 +0000 UTC m=+2965.702313461" observedRunningTime="2025-09-30 14:28:35.884010532 +0000 UTC m=+2966.268012853" watchObservedRunningTime="2025-09-30 14:28:41.201080736 +0000 UTC m=+2971.585083037" Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.931329 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:41 crc kubenswrapper[4936]: I0930 14:28:41.978182 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:43 crc kubenswrapper[4936]: I0930 14:28:43.904358 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-76fdt" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="registry-server" containerID="cri-o://0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412" gracePeriod=2 Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.315618 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:28:44 crc kubenswrapper[4936]: E0930 14:28:44.316099 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.334965 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.401967 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities\") pod \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.402024 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkqs6\" (UniqueName: \"kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6\") pod \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.402229 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content\") pod \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\" (UID: \"5d6b9a09-9c7b-48a2-b9a2-7af83327338c\") " Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.403688 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities" (OuterVolumeSpecName: "utilities") pod "5d6b9a09-9c7b-48a2-b9a2-7af83327338c" (UID: "5d6b9a09-9c7b-48a2-b9a2-7af83327338c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.408939 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6" (OuterVolumeSpecName: "kube-api-access-tkqs6") pod "5d6b9a09-9c7b-48a2-b9a2-7af83327338c" (UID: "5d6b9a09-9c7b-48a2-b9a2-7af83327338c"). InnerVolumeSpecName "kube-api-access-tkqs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.426168 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d6b9a09-9c7b-48a2-b9a2-7af83327338c" (UID: "5d6b9a09-9c7b-48a2-b9a2-7af83327338c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.505455 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.505515 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.505530 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkqs6\" (UniqueName: \"kubernetes.io/projected/5d6b9a09-9c7b-48a2-b9a2-7af83327338c-kube-api-access-tkqs6\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.915515 4936 generic.go:334] "Generic (PLEG): container finished" podID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerID="0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412" exitCode=0 Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.915573 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerDied","Data":"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412"} Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.915627 4936 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-76fdt" event={"ID":"5d6b9a09-9c7b-48a2-b9a2-7af83327338c","Type":"ContainerDied","Data":"7291b88e0fa96ac0939c9675166773d993fbfc7471579c118a96bb84df7cd50d"} Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.915647 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fdt" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.915694 4936 scope.go:117] "RemoveContainer" containerID="0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.960237 4936 scope.go:117] "RemoveContainer" containerID="3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae" Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.971098 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.984920 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fdt"] Sep 30 14:28:44 crc kubenswrapper[4936]: I0930 14:28:44.999084 4936 scope.go:117] "RemoveContainer" containerID="cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.057260 4936 scope.go:117] "RemoveContainer" containerID="0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412" Sep 30 14:28:45 crc kubenswrapper[4936]: E0930 14:28:45.057844 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412\": container with ID starting with 0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412 not found: ID does not exist" containerID="0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.057879 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412"} err="failed to get container status \"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412\": rpc error: code = NotFound desc = could not find container \"0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412\": container with ID starting with 0ad21be9f7c78b043ac05a1caab946ae026b0a165276887bb1298369fe8e1412 not found: ID does not exist" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.057906 4936 scope.go:117] "RemoveContainer" containerID="3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae" Sep 30 14:28:45 crc kubenswrapper[4936]: E0930 14:28:45.058212 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae\": container with ID starting with 3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae not found: ID does not exist" containerID="3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.058232 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae"} err="failed to get container status \"3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae\": rpc error: code = NotFound desc = could not find container \"3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae\": container with ID starting with 3c47f76042963ccd6eb9d10b6ea6c837dd604c28feec783467c02467ed7cd3ae not found: ID does not exist" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.058245 4936 scope.go:117] "RemoveContainer" containerID="cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6" Sep 30 14:28:45 crc kubenswrapper[4936]: E0930 
14:28:45.059044 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6\": container with ID starting with cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6 not found: ID does not exist" containerID="cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6" Sep 30 14:28:45 crc kubenswrapper[4936]: I0930 14:28:45.059064 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6"} err="failed to get container status \"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6\": rpc error: code = NotFound desc = could not find container \"cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6\": container with ID starting with cb1534159e4c7260c4f43b1b8b11546bc82942702137da2aaa30f16fb65dd6c6 not found: ID does not exist" Sep 30 14:28:46 crc kubenswrapper[4936]: I0930 14:28:46.325137 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" path="/var/lib/kubelet/pods/5d6b9a09-9c7b-48a2-b9a2-7af83327338c/volumes" Sep 30 14:28:56 crc kubenswrapper[4936]: I0930 14:28:56.315488 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:28:56 crc kubenswrapper[4936]: E0930 14:28:56.316441 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:29:10 crc kubenswrapper[4936]: I0930 14:29:10.320883 
4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:29:10 crc kubenswrapper[4936]: E0930 14:29:10.322686 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:29:23 crc kubenswrapper[4936]: I0930 14:29:23.315200 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:29:23 crc kubenswrapper[4936]: E0930 14:29:23.316070 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:29:38 crc kubenswrapper[4936]: I0930 14:29:38.316433 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:29:38 crc kubenswrapper[4936]: E0930 14:29:38.317494 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:29:50 crc kubenswrapper[4936]: I0930 
14:29:50.321957 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:29:50 crc kubenswrapper[4936]: E0930 14:29:50.322788 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.186719 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt"] Sep 30 14:30:00 crc kubenswrapper[4936]: E0930 14:30:00.187705 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.187724 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4936]: E0930 14:30:00.187760 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="extract-utilities" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.187768 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="extract-utilities" Sep 30 14:30:00 crc kubenswrapper[4936]: E0930 14:30:00.187783 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="extract-content" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.187789 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" 
containerName="extract-content" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.187970 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6b9a09-9c7b-48a2-b9a2-7af83327338c" containerName="registry-server" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.188685 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.191454 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.199752 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt"] Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.202659 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.326334 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.326449 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.326639 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqbrl\" (UniqueName: \"kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.428552 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.428609 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.428680 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqbrl\" (UniqueName: \"kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.429812 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.435999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.445401 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqbrl\" (UniqueName: \"kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl\") pod \"collect-profiles-29320710-2wlkt\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.507516 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:00 crc kubenswrapper[4936]: I0930 14:30:00.957372 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt"] Sep 30 14:30:01 crc kubenswrapper[4936]: I0930 14:30:01.507231 4936 generic.go:334] "Generic (PLEG): container finished" podID="b068d261-097f-44dd-af05-16be8300793e" containerID="493c8bca1e6747155ec3d4e1d7d1ca2786422de4416703cbf87a502442abf0b2" exitCode=0 Sep 30 14:30:01 crc kubenswrapper[4936]: I0930 14:30:01.507442 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" event={"ID":"b068d261-097f-44dd-af05-16be8300793e","Type":"ContainerDied","Data":"493c8bca1e6747155ec3d4e1d7d1ca2786422de4416703cbf87a502442abf0b2"} Sep 30 14:30:01 crc kubenswrapper[4936]: I0930 14:30:01.509654 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" event={"ID":"b068d261-097f-44dd-af05-16be8300793e","Type":"ContainerStarted","Data":"0dd87ac8dfc5701e1e09e4d220db74018f6aacf3f58be9433ff1f11416d13dc5"} Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.825971 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.971909 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume\") pod \"b068d261-097f-44dd-af05-16be8300793e\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.972074 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqbrl\" (UniqueName: \"kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl\") pod \"b068d261-097f-44dd-af05-16be8300793e\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.972158 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume\") pod \"b068d261-097f-44dd-af05-16be8300793e\" (UID: \"b068d261-097f-44dd-af05-16be8300793e\") " Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.972776 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume" (OuterVolumeSpecName: "config-volume") pod "b068d261-097f-44dd-af05-16be8300793e" (UID: "b068d261-097f-44dd-af05-16be8300793e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.977314 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b068d261-097f-44dd-af05-16be8300793e" (UID: "b068d261-097f-44dd-af05-16be8300793e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:02 crc kubenswrapper[4936]: I0930 14:30:02.979964 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl" (OuterVolumeSpecName: "kube-api-access-tqbrl") pod "b068d261-097f-44dd-af05-16be8300793e" (UID: "b068d261-097f-44dd-af05-16be8300793e"). InnerVolumeSpecName "kube-api-access-tqbrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.073633 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqbrl\" (UniqueName: \"kubernetes.io/projected/b068d261-097f-44dd-af05-16be8300793e-kube-api-access-tqbrl\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.073669 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b068d261-097f-44dd-af05-16be8300793e-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.073681 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b068d261-097f-44dd-af05-16be8300793e-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.526923 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" event={"ID":"b068d261-097f-44dd-af05-16be8300793e","Type":"ContainerDied","Data":"0dd87ac8dfc5701e1e09e4d220db74018f6aacf3f58be9433ff1f11416d13dc5"} Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.526972 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dd87ac8dfc5701e1e09e4d220db74018f6aacf3f58be9433ff1f11416d13dc5" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.526972 4936 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-2wlkt" Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.914797 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"] Sep 30 14:30:03 crc kubenswrapper[4936]: I0930 14:30:03.924564 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-xjlsk"] Sep 30 14:30:04 crc kubenswrapper[4936]: I0930 14:30:04.328102 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b33ff50-1265-41e6-9b6a-d526726f71cb" path="/var/lib/kubelet/pods/0b33ff50-1265-41e6-9b6a-d526726f71cb/volumes" Sep 30 14:30:05 crc kubenswrapper[4936]: I0930 14:30:05.315884 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:30:05 crc kubenswrapper[4936]: E0930 14:30:05.316530 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:30:16 crc kubenswrapper[4936]: I0930 14:30:16.315199 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:30:16 crc kubenswrapper[4936]: E0930 14:30:16.316052 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:30:16 crc kubenswrapper[4936]: I0930 14:30:16.625859 4936 generic.go:334] "Generic (PLEG): container finished" podID="acb77378-b2f6-48a5-b156-0c983ebde855" containerID="57afe8e397965dd447ec214cc4ccaa01a1697380664736d95a8f006eb79a2a0d" exitCode=0 Sep 30 14:30:16 crc kubenswrapper[4936]: I0930 14:30:16.625909 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" event={"ID":"acb77378-b2f6-48a5-b156-0c983ebde855","Type":"ContainerDied","Data":"57afe8e397965dd447ec214cc4ccaa01a1697380664736d95a8f006eb79a2a0d"} Sep 30 14:30:17 crc kubenswrapper[4936]: I0930 14:30:17.991797 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.088766 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.089047 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.089145 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc 
kubenswrapper[4936]: I0930 14:30:18.089271 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm5lh\" (UniqueName: \"kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.089461 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.089678 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph\") pod \"acb77378-b2f6-48a5-b156-0c983ebde855\" (UID: \"acb77378-b2f6-48a5-b156-0c983ebde855\") " Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.105560 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh" (OuterVolumeSpecName: "kube-api-access-rm5lh") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "kube-api-access-rm5lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.119382 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.119455 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph" (OuterVolumeSpecName: "ceph") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.120705 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.135215 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.140504 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory" (OuterVolumeSpecName: "inventory") pod "acb77378-b2f6-48a5-b156-0c983ebde855" (UID: "acb77378-b2f6-48a5-b156-0c983ebde855"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.191949 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.191980 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.191991 4936 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.192000 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.192008 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm5lh\" (UniqueName: \"kubernetes.io/projected/acb77378-b2f6-48a5-b156-0c983ebde855-kube-api-access-rm5lh\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.192017 4936 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/acb77378-b2f6-48a5-b156-0c983ebde855-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.643037 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" event={"ID":"acb77378-b2f6-48a5-b156-0c983ebde855","Type":"ContainerDied","Data":"6520928ca382d1fa8c7f31c7a4a56bb9ef5c1f75e1773e7d4bee5b61d704fa81"} Sep 30 14:30:18 crc 
kubenswrapper[4936]: I0930 14:30:18.643101 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6520928ca382d1fa8c7f31c7a4a56bb9ef5c1f75e1773e7d4bee5b61d704fa81" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.643161 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.720971 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl"] Sep 30 14:30:18 crc kubenswrapper[4936]: E0930 14:30:18.721316 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb77378-b2f6-48a5-b156-0c983ebde855" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.721347 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb77378-b2f6-48a5-b156-0c983ebde855" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 14:30:18 crc kubenswrapper[4936]: E0930 14:30:18.721374 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b068d261-097f-44dd-af05-16be8300793e" containerName="collect-profiles" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.721380 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b068d261-097f-44dd-af05-16be8300793e" containerName="collect-profiles" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.721547 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b068d261-097f-44dd-af05-16be8300793e" containerName="collect-profiles" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.721563 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb77378-b2f6-48a5-b156-0c983ebde855" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.722154 4936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724432 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724507 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724564 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724613 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724702 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724781 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.724879 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.725096 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t6q49" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.727811 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.746440 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl"] Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803574 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803625 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803645 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803682 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803723 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803759 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803881 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803903 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803928 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803944 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.803959 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk8cm\" (UniqueName: \"kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.905803 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.905887 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.905932 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.905997 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906014 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906040 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906058 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906075 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk8cm\" (UniqueName: \"kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906114 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906149 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.906177 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.909806 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.910820 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.911462 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.911792 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" 
(UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.912148 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.913277 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.913593 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.914144 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.914573 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.915637 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:18 crc kubenswrapper[4936]: I0930 14:30:18.929204 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk8cm\" (UniqueName: \"kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:19 crc kubenswrapper[4936]: I0930 14:30:19.041791 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:30:19 crc kubenswrapper[4936]: I0930 14:30:19.601928 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl"] Sep 30 14:30:19 crc kubenswrapper[4936]: I0930 14:30:19.652268 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" event={"ID":"32b191a4-92aa-4f6a-998e-0877753b109d","Type":"ContainerStarted","Data":"95fefc5452dc8e623adefc08fbfde4c6ebcbecc375df5b16290794173d60c37a"} Sep 30 14:30:20 crc kubenswrapper[4936]: I0930 14:30:20.661240 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" event={"ID":"32b191a4-92aa-4f6a-998e-0877753b109d","Type":"ContainerStarted","Data":"274e8e765f5a6bde989e2de85e386c5e24aacc37bb9e5b56ac863aba63e0d852"} Sep 30 14:30:20 crc kubenswrapper[4936]: I0930 14:30:20.686948 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" podStartSLOduration=2.215591562 podStartE2EDuration="2.686927778s" podCreationTimestamp="2025-09-30 14:30:18 +0000 UTC" firstStartedPulling="2025-09-30 14:30:19.599273226 +0000 UTC m=+3069.983275517" lastFinishedPulling="2025-09-30 14:30:20.070609432 +0000 UTC m=+3070.454611733" observedRunningTime="2025-09-30 14:30:20.681690364 +0000 UTC m=+3071.065692685" watchObservedRunningTime="2025-09-30 14:30:20.686927778 +0000 UTC m=+3071.070930069" Sep 30 14:30:30 crc kubenswrapper[4936]: I0930 14:30:30.321456 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:30:30 crc kubenswrapper[4936]: E0930 14:30:30.322151 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:30:39 crc kubenswrapper[4936]: I0930 14:30:39.480706 4936 scope.go:117] "RemoveContainer" containerID="64480263703ac38729b088ebaca9d8ce6b5cdd547789790c6973bffb7cab0180" Sep 30 14:30:41 crc kubenswrapper[4936]: I0930 14:30:41.316799 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:30:41 crc kubenswrapper[4936]: E0930 14:30:41.317378 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:30:56 crc kubenswrapper[4936]: I0930 14:30:56.317114 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:30:56 crc kubenswrapper[4936]: E0930 14:30:56.318812 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:31:10 crc kubenswrapper[4936]: I0930 14:31:10.327435 4936 scope.go:117] "RemoveContainer" 
containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:31:10 crc kubenswrapper[4936]: E0930 14:31:10.328513 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:31:23 crc kubenswrapper[4936]: I0930 14:31:23.315972 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:31:23 crc kubenswrapper[4936]: E0930 14:31:23.316575 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:31:36 crc kubenswrapper[4936]: I0930 14:31:36.315131 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:31:36 crc kubenswrapper[4936]: E0930 14:31:36.315976 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:31:39 crc kubenswrapper[4936]: I0930 14:31:39.542565 4936 scope.go:117] 
"RemoveContainer" containerID="2c69dae458154284b9deb996f5e5f39ca06880c6a672be842c8fe8435c25d9f3" Sep 30 14:31:39 crc kubenswrapper[4936]: I0930 14:31:39.570153 4936 scope.go:117] "RemoveContainer" containerID="b0daf9bfbabb7b60aab668c0c6c7dc5c71579c6f5e2515317ab41256f3c2c962" Sep 30 14:31:39 crc kubenswrapper[4936]: I0930 14:31:39.620169 4936 scope.go:117] "RemoveContainer" containerID="35f9b88c1bb364c211748559913d58ef2601bc425f7dd08a241c176c51c36fec" Sep 30 14:31:47 crc kubenswrapper[4936]: I0930 14:31:47.317145 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:31:47 crc kubenswrapper[4936]: E0930 14:31:47.317954 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:32:02 crc kubenswrapper[4936]: I0930 14:32:02.317714 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:32:02 crc kubenswrapper[4936]: E0930 14:32:02.318521 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:32:14 crc kubenswrapper[4936]: I0930 14:32:14.315659 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:32:14 crc 
kubenswrapper[4936]: E0930 14:32:14.316497 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:32:26 crc kubenswrapper[4936]: I0930 14:32:26.315539 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:32:26 crc kubenswrapper[4936]: I0930 14:32:26.690126 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9"} Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.127565 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.151002 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.180397 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.180570 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbnk\" (UniqueName: \"kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.180630 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.196263 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.282136 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbnk\" (UniqueName: \"kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.282213 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.282252 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.282834 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.282910 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.302186 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbnk\" (UniqueName: \"kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk\") pod \"community-operators-zxc9h\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:49 crc kubenswrapper[4936]: I0930 14:33:49.496517 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:50 crc kubenswrapper[4936]: I0930 14:33:50.119488 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:33:50 crc kubenswrapper[4936]: W0930 14:33:50.133866 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb7a902_bc90_4fdb_bb53_e1d39428e737.slice/crio-0d4dc43b6703b5cbb02e4256867af27d741ec4f3a391ce07506b0c4b519c8928 WatchSource:0}: Error finding container 0d4dc43b6703b5cbb02e4256867af27d741ec4f3a391ce07506b0c4b519c8928: Status 404 returned error can't find the container with id 0d4dc43b6703b5cbb02e4256867af27d741ec4f3a391ce07506b0c4b519c8928 Sep 30 14:33:50 crc kubenswrapper[4936]: I0930 14:33:50.400659 4936 generic.go:334] "Generic (PLEG): container finished" podID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerID="316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79" exitCode=0 Sep 30 14:33:50 crc kubenswrapper[4936]: I0930 14:33:50.400716 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerDied","Data":"316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79"} Sep 30 14:33:50 crc kubenswrapper[4936]: I0930 14:33:50.400747 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerStarted","Data":"0d4dc43b6703b5cbb02e4256867af27d741ec4f3a391ce07506b0c4b519c8928"} Sep 30 14:33:50 crc kubenswrapper[4936]: I0930 14:33:50.403310 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:33:52 crc kubenswrapper[4936]: I0930 14:33:52.423950 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerStarted","Data":"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6"} Sep 30 14:33:53 crc kubenswrapper[4936]: I0930 14:33:53.433708 4936 generic.go:334] "Generic (PLEG): container finished" podID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerID="41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6" exitCode=0 Sep 30 14:33:53 crc kubenswrapper[4936]: I0930 14:33:53.433795 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerDied","Data":"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6"} Sep 30 14:33:54 crc kubenswrapper[4936]: I0930 14:33:54.446493 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerStarted","Data":"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e"} Sep 30 14:33:54 crc kubenswrapper[4936]: I0930 14:33:54.473325 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zxc9h" podStartSLOduration=2.064623621 podStartE2EDuration="5.473301254s" podCreationTimestamp="2025-09-30 14:33:49 +0000 UTC" firstStartedPulling="2025-09-30 14:33:50.402874121 +0000 UTC m=+3280.786876422" lastFinishedPulling="2025-09-30 14:33:53.811551754 +0000 UTC m=+3284.195554055" observedRunningTime="2025-09-30 14:33:54.467632589 +0000 UTC m=+3284.851634900" watchObservedRunningTime="2025-09-30 14:33:54.473301254 +0000 UTC m=+3284.857303555" Sep 30 14:33:59 crc kubenswrapper[4936]: I0930 14:33:59.497261 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:59 crc kubenswrapper[4936]: I0930 14:33:59.497857 4936 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:33:59 crc kubenswrapper[4936]: I0930 14:33:59.549782 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:34:00 crc kubenswrapper[4936]: I0930 14:34:00.543630 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:34:00 crc kubenswrapper[4936]: I0930 14:34:00.589129 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:34:02 crc kubenswrapper[4936]: I0930 14:34:02.510704 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zxc9h" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="registry-server" containerID="cri-o://12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e" gracePeriod=2 Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.001326 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.157967 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntbnk\" (UniqueName: \"kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk\") pod \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.158124 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content\") pod \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.160319 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities\") pod \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\" (UID: \"5eb7a902-bc90-4fdb-bb53-e1d39428e737\") " Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.163296 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities" (OuterVolumeSpecName: "utilities") pod "5eb7a902-bc90-4fdb-bb53-e1d39428e737" (UID: "5eb7a902-bc90-4fdb-bb53-e1d39428e737"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.163724 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.178100 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk" (OuterVolumeSpecName: "kube-api-access-ntbnk") pod "5eb7a902-bc90-4fdb-bb53-e1d39428e737" (UID: "5eb7a902-bc90-4fdb-bb53-e1d39428e737"). InnerVolumeSpecName "kube-api-access-ntbnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.231922 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eb7a902-bc90-4fdb-bb53-e1d39428e737" (UID: "5eb7a902-bc90-4fdb-bb53-e1d39428e737"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.265716 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntbnk\" (UniqueName: \"kubernetes.io/projected/5eb7a902-bc90-4fdb-bb53-e1d39428e737-kube-api-access-ntbnk\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.265757 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7a902-bc90-4fdb-bb53-e1d39428e737-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.525495 4936 generic.go:334] "Generic (PLEG): container finished" podID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerID="12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e" exitCode=0 Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.525578 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerDied","Data":"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e"} Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.525629 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxc9h" event={"ID":"5eb7a902-bc90-4fdb-bb53-e1d39428e737","Type":"ContainerDied","Data":"0d4dc43b6703b5cbb02e4256867af27d741ec4f3a391ce07506b0c4b519c8928"} Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.525655 4936 scope.go:117] "RemoveContainer" containerID="12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.525586 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxc9h" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.572725 4936 scope.go:117] "RemoveContainer" containerID="41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.577684 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.597009 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zxc9h"] Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.607145 4936 scope.go:117] "RemoveContainer" containerID="316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.719554 4936 scope.go:117] "RemoveContainer" containerID="12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e" Sep 30 14:34:03 crc kubenswrapper[4936]: E0930 14:34:03.720594 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e\": container with ID starting with 12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e not found: ID does not exist" containerID="12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.720647 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e"} err="failed to get container status \"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e\": rpc error: code = NotFound desc = could not find container \"12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e\": container with ID starting with 12941a2bca90bc79d95e1027b1f9b284268570b0ae70e9d33bb0940f691ee76e not 
found: ID does not exist" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.720682 4936 scope.go:117] "RemoveContainer" containerID="41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6" Sep 30 14:34:03 crc kubenswrapper[4936]: E0930 14:34:03.721246 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6\": container with ID starting with 41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6 not found: ID does not exist" containerID="41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.721300 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6"} err="failed to get container status \"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6\": rpc error: code = NotFound desc = could not find container \"41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6\": container with ID starting with 41c0c724f506a8a7d0902294a1e8381fd7f8607534bf6437d98cbc130817e7b6 not found: ID does not exist" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.721563 4936 scope.go:117] "RemoveContainer" containerID="316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79" Sep 30 14:34:03 crc kubenswrapper[4936]: E0930 14:34:03.721914 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79\": container with ID starting with 316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79 not found: ID does not exist" containerID="316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79" Sep 30 14:34:03 crc kubenswrapper[4936]: I0930 14:34:03.721957 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79"} err="failed to get container status \"316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79\": rpc error: code = NotFound desc = could not find container \"316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79\": container with ID starting with 316fc641d60c0ad51810143fbe03b645001d37dd87e953df32e23683db403f79 not found: ID does not exist" Sep 30 14:34:04 crc kubenswrapper[4936]: I0930 14:34:04.328593 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" path="/var/lib/kubelet/pods/5eb7a902-bc90-4fdb-bb53-e1d39428e737/volumes" Sep 30 14:34:04 crc kubenswrapper[4936]: I0930 14:34:04.542010 4936 generic.go:334] "Generic (PLEG): container finished" podID="32b191a4-92aa-4f6a-998e-0877753b109d" containerID="274e8e765f5a6bde989e2de85e386c5e24aacc37bb9e5b56ac863aba63e0d852" exitCode=0 Sep 30 14:34:04 crc kubenswrapper[4936]: I0930 14:34:04.542076 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" event={"ID":"32b191a4-92aa-4f6a-998e-0877753b109d","Type":"ContainerDied","Data":"274e8e765f5a6bde989e2de85e386c5e24aacc37bb9e5b56ac863aba63e0d852"} Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.022181 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133021 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133628 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133679 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133717 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133789 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133873 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.133963 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.134008 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.134077 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.134112 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0\") pod \"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.134168 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk8cm\" (UniqueName: \"kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm\") pod 
\"32b191a4-92aa-4f6a-998e-0877753b109d\" (UID: \"32b191a4-92aa-4f6a-998e-0877753b109d\") " Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.141519 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph" (OuterVolumeSpecName: "ceph") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.141879 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm" (OuterVolumeSpecName: "kube-api-access-sk8cm") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "kube-api-access-sk8cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.142470 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.169706 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.177205 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.178847 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory" (OuterVolumeSpecName: "inventory") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.185418 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.186220 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.191165 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.198693 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.210632 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "32b191a4-92aa-4f6a-998e-0877753b109d" (UID: "32b191a4-92aa-4f6a-998e-0877753b109d"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236363 4936 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236414 4936 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236426 4936 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236439 4936 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236450 4936 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/32b191a4-92aa-4f6a-998e-0877753b109d-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236460 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236470 4936 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc 
kubenswrapper[4936]: I0930 14:34:06.236481 4936 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236491 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk8cm\" (UniqueName: \"kubernetes.io/projected/32b191a4-92aa-4f6a-998e-0877753b109d-kube-api-access-sk8cm\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236501 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.236513 4936 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b191a4-92aa-4f6a-998e-0877753b109d-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.596175 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" event={"ID":"32b191a4-92aa-4f6a-998e-0877753b109d","Type":"ContainerDied","Data":"95fefc5452dc8e623adefc08fbfde4c6ebcbecc375df5b16290794173d60c37a"} Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.596249 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95fefc5452dc8e623adefc08fbfde4c6ebcbecc375df5b16290794173d60c37a" Sep 30 14:34:06 crc kubenswrapper[4936]: I0930 14:34:06.596471 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.545413 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 14:34:21 crc kubenswrapper[4936]: E0930 14:34:21.546451 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="extract-utilities" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546468 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="extract-utilities" Sep 30 14:34:21 crc kubenswrapper[4936]: E0930 14:34:21.546490 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="registry-server" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546498 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="registry-server" Sep 30 14:34:21 crc kubenswrapper[4936]: E0930 14:34:21.546512 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b191a4-92aa-4f6a-998e-0877753b109d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546523 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b191a4-92aa-4f6a-998e-0877753b109d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:21 crc kubenswrapper[4936]: E0930 14:34:21.546536 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="extract-content" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546544 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="extract-content" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546754 4936 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7a902-bc90-4fdb-bb53-e1d39428e737" containerName="registry-server" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.546771 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b191a4-92aa-4f6a-998e-0877753b109d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.548022 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.551375 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.554146 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.561454 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.563470 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.578574 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.588001 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.613504 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.631388 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.631727 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.631843 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.631947 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632053 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632181 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632280 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632410 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632533 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632638 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632747 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-lib-modules\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.632886 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633000 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-ceph\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633114 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-dev\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 
14:34:21.633212 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633307 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-scripts\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633489 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633605 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633711 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633803 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-run\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.633895 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634000 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634215 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-sys\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634372 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634485 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634584 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634688 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634796 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.634911 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvkg\" (UniqueName: \"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-kube-api-access-dcvkg\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.635055 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.639976 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.640125 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9kn\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-kube-api-access-sm9kn\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742078 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-dev\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742121 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742181 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-scripts\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " 
pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742216 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742245 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742266 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742264 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-dev\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742283 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742298 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-run\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742346 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742364 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-sys\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742387 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742401 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742418 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " 
pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742435 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742455 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742474 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvkg\" (UniqueName: \"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-kube-api-access-dcvkg\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742508 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742532 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742548 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sm9kn\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-kube-api-access-sm9kn\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742565 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742594 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742609 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742629 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742647 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-machine-id\") pod 
\"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742664 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742683 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742704 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742725 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742746 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 
14:34:21.742763 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-lib-modules\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742802 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742819 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-ceph\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.742888 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743052 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743803 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-machine-id\") pod 
\"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743875 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-run\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743901 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743922 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-sys\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.743943 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744087 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744502 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744553 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-lib-modules\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744589 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744629 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744664 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744865 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 
30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744885 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744923 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744950 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.744973 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4852707-8aff-49ed-b929-3bdcf9cd921a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.745128 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2d6eb814-5e27-493b-b63e-e8eddf561330-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.748523 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-scripts\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.748727 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.752695 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.753582 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.755211 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.758024 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" 
Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.758152 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-ceph\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.762469 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6eb814-5e27-493b-b63e-e8eddf561330-config-data\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.768883 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.769717 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4852707-8aff-49ed-b929-3bdcf9cd921a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.775322 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9kn\" (UniqueName: \"kubernetes.io/projected/c4852707-8aff-49ed-b929-3bdcf9cd921a-kube-api-access-sm9kn\") pod \"cinder-volume-volume1-0\" (UID: \"c4852707-8aff-49ed-b929-3bdcf9cd921a\") " pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.779915 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvkg\" (UniqueName: 
\"kubernetes.io/projected/2d6eb814-5e27-493b-b63e-e8eddf561330-kube-api-access-dcvkg\") pod \"cinder-backup-0\" (UID: \"2d6eb814-5e27-493b-b63e-e8eddf561330\") " pod="openstack/cinder-backup-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.868895 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:21 crc kubenswrapper[4936]: I0930 14:34:21.881312 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.382488 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.389035 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.398759 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.399069 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.399213 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.399397 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxmps" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.418107 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-vnnf4"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.425643 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.436526 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.511372 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-vnnf4"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.513802 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hfnc\" (UniqueName: \"kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc\") pod \"manila-db-create-vnnf4\" (UID: \"9100f851-91ef-4d1d-8346-047e97aef7c2\") " pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617540 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617597 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617640 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc 
kubenswrapper[4936]: I0930 14:34:22.617679 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617748 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pznb\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617772 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617821 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hfnc\" (UniqueName: \"kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc\") pod \"manila-db-create-vnnf4\" (UID: \"9100f851-91ef-4d1d-8346-047e97aef7c2\") " pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617839 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" 
Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617857 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.617873 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.618947 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.621244 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.631761 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.632086 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.658503 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hfnc\" (UniqueName: \"kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc\") pod \"manila-db-create-vnnf4\" (UID: \"9100f851-91ef-4d1d-8346-047e97aef7c2\") " pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.667787 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.687599 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.698107 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.702155 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.702431 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.702565 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.702682 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c4m5z" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.717393 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.725609 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.725843 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726186 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pznb\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb\") pod \"glance-default-external-api-0\" (UID: 
\"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726251 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726734 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726769 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgmfl\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726803 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726842 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726859 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726879 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726918 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726955 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.726987 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.727035 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.727064 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.727086 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.727158 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.727184 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " 
pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.731900 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.732170 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.732572 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.732576 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.736837 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: E0930 14:34:22.737908 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph config-data glance kube-api-access-5pznb logs public-tls-certs], unattached volumes=[], failed to process 
volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="12dae184-f6bd-48ed-ba76-de2c2679c959" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.739680 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.740522 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.742561 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.758060 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.789424 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pznb\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " 
pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.791674 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828430 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828490 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828533 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828565 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756tm\" (UniqueName: \"kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc 
kubenswrapper[4936]: I0930 14:34:22.828605 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgmfl\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828648 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828715 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828746 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828767 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828789 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828818 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828841 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.828857 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.831787 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.844224 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.844448 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.844977 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.849951 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.851749 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.853631 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.870223 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.932553 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756tm\" (UniqueName: \"kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.932741 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.932787 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.932825 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.932867 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.936847 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.938187 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.950753 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.972736 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.991212 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.994136 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgmfl\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.997881 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:22 crc kubenswrapper[4936]: I0930 14:34:22.998551 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:22 crc kubenswrapper[4936]: E0930 14:34:22.999030 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.027091 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756tm\" 
(UniqueName: \"kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm\") pod \"horizon-79464b7c69-cqz8d\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.027166 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.029010 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.044105 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.050129 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.137877 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.137928 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.137982 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b275z\" (UniqueName: \"kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z\") pod 
\"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.138027 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.138049 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.242998 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.244147 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.244964 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 
14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.245010 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.246324 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.246597 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b275z\" (UniqueName: \"kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.248371 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.254134 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.266856 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.301298 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b275z\" (UniqueName: \"kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z\") pod \"horizon-76ccbbb9dc-7twmk\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.425904 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.639412 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-vnnf4"] Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.756361 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.804148 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.804748 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-vnnf4" event={"ID":"9100f851-91ef-4d1d-8346-047e97aef7c2","Type":"ContainerStarted","Data":"ce3f7ba12c66e6acb8060db488abf810b2e3aac34385076d44bd650878ccba07"} Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.804794 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.942950 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:23 crc kubenswrapper[4936]: I0930 14:34:23.956266 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072252 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072330 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072384 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072411 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072509 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc 
kubenswrapper[4936]: I0930 14:34:24.072532 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072574 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072614 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgmfl\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072646 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pznb\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072699 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072715 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072763 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072795 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072816 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072850 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072897 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072922 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs\") pod \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\" (UID: \"5ac80a48-e22c-4ce3-ba9c-fac743b80b7a\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.072953 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run\") pod \"12dae184-f6bd-48ed-ba76-de2c2679c959\" (UID: \"12dae184-f6bd-48ed-ba76-de2c2679c959\") " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.074363 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.075261 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs" (OuterVolumeSpecName: "logs") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.075285 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.077757 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs" (OuterVolumeSpecName: "logs") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.083178 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.085430 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph" (OuterVolumeSpecName: "ceph") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.088915 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data" (OuterVolumeSpecName: "config-data") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.089071 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.096058 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.096187 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb" (OuterVolumeSpecName: "kube-api-access-5pznb") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "kube-api-access-5pznb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.096931 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl" (OuterVolumeSpecName: "kube-api-access-jgmfl") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "kube-api-access-jgmfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.097502 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data" (OuterVolumeSpecName: "config-data") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.097859 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts" (OuterVolumeSpecName: "scripts") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.098429 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.116843 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.116937 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.117022 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts" (OuterVolumeSpecName: "scripts") pod "12dae184-f6bd-48ed-ba76-de2c2679c959" (UID: "12dae184-f6bd-48ed-ba76-de2c2679c959"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.117125 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph" (OuterVolumeSpecName: "ceph") pod "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" (UID: "5ac80a48-e22c-4ce3-ba9c-fac743b80b7a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175536 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175572 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175581 4936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12dae184-f6bd-48ed-ba76-de2c2679c959-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175611 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175625 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175634 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175643 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175654 4936 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175662 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175670 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175682 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgmfl\" (UniqueName: \"kubernetes.io/projected/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-kube-api-access-jgmfl\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175696 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pznb\" (UniqueName: \"kubernetes.io/projected/12dae184-f6bd-48ed-ba76-de2c2679c959-kube-api-access-5pznb\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175706 4936 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175724 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175734 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc 
kubenswrapper[4936]: I0930 14:34:24.175743 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175754 4936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.175764 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12dae184-f6bd-48ed-ba76-de2c2679c959-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.235972 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.248678 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.262089 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.277673 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.277720 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.281957 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.639486 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 14:34:24 crc kubenswrapper[4936]: W0930 14:34:24.713367 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4852707_8aff_49ed_b929_3bdcf9cd921a.slice/crio-a961c7f97ec986333d8b7ed3c7b1ebd7e6b0856b5e1924ac3a432863f90f3989 WatchSource:0}: Error finding container a961c7f97ec986333d8b7ed3c7b1ebd7e6b0856b5e1924ac3a432863f90f3989: Status 404 returned error can't find the container with id a961c7f97ec986333d8b7ed3c7b1ebd7e6b0856b5e1924ac3a432863f90f3989 Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.816069 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerStarted","Data":"b4dcd2925ae384002521e9668ed98cda0217a4dd5903d40d4af8918f85073c1c"} Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.817659 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c4852707-8aff-49ed-b929-3bdcf9cd921a","Type":"ContainerStarted","Data":"a961c7f97ec986333d8b7ed3c7b1ebd7e6b0856b5e1924ac3a432863f90f3989"} Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.819653 4936 generic.go:334] "Generic (PLEG): container finished" podID="9100f851-91ef-4d1d-8346-047e97aef7c2" containerID="e7b4eb175e63344b1d89123bd36c37a92840469768c64cefe4a8974b7ea807de" exitCode=0 Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.819710 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-vnnf4" event={"ID":"9100f851-91ef-4d1d-8346-047e97aef7c2","Type":"ContainerDied","Data":"e7b4eb175e63344b1d89123bd36c37a92840469768c64cefe4a8974b7ea807de"} Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.822131 4936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerStarted","Data":"424549dc6d1659968c92fc5bf7d47d7d2b896c7fef88e89b5216701ed82ba63a"} Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.824202 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.824455 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.824424 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2d6eb814-5e27-493b-b63e-e8eddf561330","Type":"ContainerStarted","Data":"336486398d636af02212d4839a251d0cbf98828d9aa8b9d11a40b8f02a115a00"} Sep 30 14:34:24 crc kubenswrapper[4936]: I0930 14:34:24.996285 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.050472 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.060741 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.062680 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.080296 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.080715 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.080827 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxmps" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.080421 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.080960 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.096449 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.113803 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.113927 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.115987 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.118991 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.119052 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.119367 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244661 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244709 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244733 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244792 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244815 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244859 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244882 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244930 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77whx\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.244958 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245173 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245251 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245394 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245432 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245474 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245514 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh7lp\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245552 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.245613 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348108 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348172 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348234 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348254 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348306 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77whx\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348355 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348384 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348409 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348489 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348546 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348573 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs\") pod 
\"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348615 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348639 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh7lp\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348689 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348720 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348777 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348792 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.348806 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.349367 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.350753 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.351524 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 
14:34:25.355736 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.356051 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.359550 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.360135 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.360722 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.361581 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.364238 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.364767 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.367286 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.371124 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.373487 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.384141 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.384741 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.386963 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77whx\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.405823 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh7lp\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.419905 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " 
pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.420456 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.477979 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.492992 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.966916 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2d6eb814-5e27-493b-b63e-e8eddf561330","Type":"ContainerStarted","Data":"cf026951f2cc06e7e1e86fc5e03fca3f29de80fcf0ca9d01cac12c9d2cd9c238"} Sep 30 14:34:25 crc kubenswrapper[4936]: I0930 14:34:25.967258 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2d6eb814-5e27-493b-b63e-e8eddf561330","Type":"ContainerStarted","Data":"b28221a58be4a1f6fe8ffd72106c66865f2a36ce7dc5adca904195ed8d5668e4"} Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.105745 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.103763306 podStartE2EDuration="5.105726945s" podCreationTimestamp="2025-09-30 14:34:21 +0000 UTC" firstStartedPulling="2025-09-30 14:34:23.787301057 +0000 UTC m=+3314.171303348" lastFinishedPulling="2025-09-30 14:34:24.789264686 +0000 UTC m=+3315.173266987" observedRunningTime="2025-09-30 14:34:26.075896676 +0000 UTC m=+3316.459898977" watchObservedRunningTime="2025-09-30 14:34:26.105726945 +0000 UTC m=+3316.489729246" Sep 30 
14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.342915 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12dae184-f6bd-48ed-ba76-de2c2679c959" path="/var/lib/kubelet/pods/12dae184-f6bd-48ed-ba76-de2c2679c959/volumes" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.343893 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac80a48-e22c-4ce3-ba9c-fac743b80b7a" path="/var/lib/kubelet/pods/5ac80a48-e22c-4ce3-ba9c-fac743b80b7a/volumes" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.564804 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.659603 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.662397 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.668034 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.733124 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.816570 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.846427 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.846861 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.846982 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.847067 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.847155 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.847264 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6gw\" (UniqueName: \"kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.847388 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.884616 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.899907 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.944983 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b866fc884-w2td6"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.947207 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951097 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951297 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951368 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts\") pod \"horizon-7c45676df6-k4rk6\" (UID: 
\"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951403 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951423 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951466 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6gw\" (UniqueName: \"kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.951492 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.952411 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 
30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.952826 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.981554 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.983126 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b866fc884-w2td6"] Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.983838 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.984042 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:26 crc kubenswrapper[4936]: I0930 14:34:26.985179 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " 
pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.023139 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.030161 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6gw\" (UniqueName: \"kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw\") pod \"horizon-7c45676df6-k4rk6\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.030170 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c4852707-8aff-49ed-b929-3bdcf9cd921a","Type":"ContainerStarted","Data":"aed6da3ee0e005d213119848f4862253e4a0351d8331e73a4c2c75e977822549"} Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.040762 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.048967 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.059151 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-tls-certs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.063018 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-kube-api-access-hqpqr\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.065000 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-logs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.065303 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-scripts\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.065561 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-combined-ca-bundle\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.065780 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-config-data\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.065931 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-secret-key\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.167554 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-tls-certs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.167591 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-kube-api-access-hqpqr\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.167617 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-logs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.167699 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-scripts\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.167736 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-combined-ca-bundle\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.168452 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-config-data\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.168488 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-secret-key\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.169983 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-scripts\") pod \"horizon-7b866fc884-w2td6\" (UID: 
\"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.170028 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-config-data\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.172255 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-logs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.177683 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.178469 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-secret-key\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.184794 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-combined-ca-bundle\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.198512 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqpqr\" (UniqueName: 
\"kubernetes.io/projected/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-kube-api-access-hqpqr\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.210853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e28ad1d-adf7-4316-9df6-db8a7c1e3933-horizon-tls-certs\") pod \"horizon-7b866fc884-w2td6\" (UID: \"1e28ad1d-adf7-4316-9df6-db8a7c1e3933\") " pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.271265 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hfnc\" (UniqueName: \"kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc\") pod \"9100f851-91ef-4d1d-8346-047e97aef7c2\" (UID: \"9100f851-91ef-4d1d-8346-047e97aef7c2\") " Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.283401 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc" (OuterVolumeSpecName: "kube-api-access-2hfnc") pod "9100f851-91ef-4d1d-8346-047e97aef7c2" (UID: "9100f851-91ef-4d1d-8346-047e97aef7c2"). InnerVolumeSpecName "kube-api-access-2hfnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.376270 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hfnc\" (UniqueName: \"kubernetes.io/projected/9100f851-91ef-4d1d-8346-047e97aef7c2-kube-api-access-2hfnc\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.477822 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.574442 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:27 crc kubenswrapper[4936]: I0930 14:34:27.880007 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.062248 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerStarted","Data":"e5e6fc85462716aa95ebb101dc5677120ee3bbda1b8f977bc491685bc66c1911"} Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.069884 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c4852707-8aff-49ed-b929-3bdcf9cd921a","Type":"ContainerStarted","Data":"d3f3ba283a56408a9c6ee6023561d83f3f536813aa12ec4b49cc7783fde02187"} Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.081681 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-vnnf4" event={"ID":"9100f851-91ef-4d1d-8346-047e97aef7c2","Type":"ContainerDied","Data":"ce3f7ba12c66e6acb8060db488abf810b2e3aac34385076d44bd650878ccba07"} Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.081723 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce3f7ba12c66e6acb8060db488abf810b2e3aac34385076d44bd650878ccba07" Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.081788 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-vnnf4" Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.101696 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=6.073908337 podStartE2EDuration="7.101675744s" podCreationTimestamp="2025-09-30 14:34:21 +0000 UTC" firstStartedPulling="2025-09-30 14:34:24.715761377 +0000 UTC m=+3315.099763678" lastFinishedPulling="2025-09-30 14:34:25.743528784 +0000 UTC m=+3316.127531085" observedRunningTime="2025-09-30 14:34:28.100467541 +0000 UTC m=+3318.484469842" watchObservedRunningTime="2025-09-30 14:34:28.101675744 +0000 UTC m=+3318.485678045" Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.102119 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerStarted","Data":"85dff7efd01e16d8d4d327806d032cde78db24a787a3b902f8066fd359a6c43e"} Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.110397 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerStarted","Data":"1d92d0fbf462a16c2ee0f9999b85efc6fdfc5d0d1a9c47924d759aa4cdc850e5"} Sep 30 14:34:28 crc kubenswrapper[4936]: I0930 14:34:28.166510 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b866fc884-w2td6"] Sep 30 14:34:29 crc kubenswrapper[4936]: I0930 14:34:29.128081 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerStarted","Data":"a6e0ba5daef6c3aa8d3b4a81f7f2ebb093b8dc4c36854e4d5d633e6d95515ab4"} Sep 30 14:34:29 crc kubenswrapper[4936]: I0930 14:34:29.151694 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerStarted","Data":"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513"} Sep 30 14:34:29 crc kubenswrapper[4936]: I0930 14:34:29.182029 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerStarted","Data":"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2"} Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.196828 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerStarted","Data":"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7"} Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.197055 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-httpd" containerID="cri-o://ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" gracePeriod=30 Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.197128 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-log" containerID="cri-o://88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" gracePeriod=30 Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.213088 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerStarted","Data":"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671"} Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.213726 4936 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-httpd" containerID="cri-o://120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" gracePeriod=30 Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.213713 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-log" containerID="cri-o://c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" gracePeriod=30 Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.223891 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.223867691 podStartE2EDuration="6.223867691s" podCreationTimestamp="2025-09-30 14:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:34:30.21765289 +0000 UTC m=+3320.601655211" watchObservedRunningTime="2025-09-30 14:34:30.223867691 +0000 UTC m=+3320.607869992" Sep 30 14:34:30 crc kubenswrapper[4936]: I0930 14:34:30.255495 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.255472889 podStartE2EDuration="5.255472889s" podCreationTimestamp="2025-09-30 14:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:34:30.255427438 +0000 UTC m=+3320.639429749" watchObservedRunningTime="2025-09-30 14:34:30.255472889 +0000 UTC m=+3320.639475190" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.050940 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.139271 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225010 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225116 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225143 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225158 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225175 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc 
kubenswrapper[4936]: I0930 14:34:31.225194 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225286 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225308 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225447 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225470 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77whx\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225540 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle\") pod 
\"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225581 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225640 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225669 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh7lp\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225685 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225708 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data\") pod \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\" (UID: \"e79baab4-ef50-47f8-9533-33d2bdf54fbe\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225735 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.225826 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph\") pod \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\" (UID: \"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9\") " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.229026 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.229348 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs" (OuterVolumeSpecName: "logs") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.236491 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts" (OuterVolumeSpecName: "scripts") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.239091 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts" (OuterVolumeSpecName: "scripts") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.239547 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs" (OuterVolumeSpecName: "logs") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.239880 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph" (OuterVolumeSpecName: "ceph") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.240066 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.244281 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx" (OuterVolumeSpecName: "kube-api-access-77whx") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "kube-api-access-77whx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.248635 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp" (OuterVolumeSpecName: "kube-api-access-gh7lp") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "kube-api-access-gh7lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255656 4936 generic.go:334] "Generic (PLEG): container finished" podID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerID="ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" exitCode=143 Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255695 4936 generic.go:334] "Generic (PLEG): container finished" podID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerID="88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" exitCode=143 Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255775 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerDied","Data":"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255807 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerDied","Data":"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255819 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e79baab4-ef50-47f8-9533-33d2bdf54fbe","Type":"ContainerDied","Data":"e5e6fc85462716aa95ebb101dc5677120ee3bbda1b8f977bc491685bc66c1911"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.255836 4936 scope.go:117] "RemoveContainer" containerID="ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.256002 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.259192 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph" (OuterVolumeSpecName: "ceph") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.259827 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.266583 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). 
InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284050 4936 generic.go:334] "Generic (PLEG): container finished" podID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerID="120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" exitCode=143 Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284086 4936 generic.go:334] "Generic (PLEG): container finished" podID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerID="c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" exitCode=143 Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284108 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerDied","Data":"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284137 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerDied","Data":"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284147 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9","Type":"ContainerDied","Data":"85dff7efd01e16d8d4d327806d032cde78db24a787a3b902f8066fd359a6c43e"} Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.284252 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.303954 4936 scope.go:117] "RemoveContainer" containerID="88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.318250 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330019 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330066 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77whx\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-kube-api-access-77whx\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330080 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330091 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330105 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh7lp\" (UniqueName: 
\"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-kube-api-access-gh7lp\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330138 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330150 4936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330163 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330173 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330184 4936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e79baab4-ef50-47f8-9533-33d2bdf54fbe-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330194 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330211 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.330234 4936 reconciler_common.go:293] "Volume detached 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e79baab4-ef50-47f8-9533-33d2bdf54fbe-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.351846 4936 scope.go:117] "RemoveContainer" containerID="ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.352975 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7\": container with ID starting with ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7 not found: ID does not exist" containerID="ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.353032 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7"} err="failed to get container status \"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7\": rpc error: code = NotFound desc = could not find container \"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7\": container with ID starting with ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.353062 4936 scope.go:117] "RemoveContainer" containerID="88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.355978 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2\": container with ID starting with 88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2 not found: ID does not exist" 
containerID="88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.356029 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2"} err="failed to get container status \"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2\": rpc error: code = NotFound desc = could not find container \"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2\": container with ID starting with 88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.356058 4936 scope.go:117] "RemoveContainer" containerID="ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.367916 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.379775 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.399978 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data" (OuterVolumeSpecName: "config-data") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.400834 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7"} err="failed to get container status \"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7\": rpc error: code = NotFound desc = could not find container \"ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7\": container with ID starting with ab510653aa20f414f1b76ac2e379b189fd398e33e97d4221f870ac2b3bb4a8c7 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.400885 4936 scope.go:117] "RemoveContainer" containerID="88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.400867 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.405009 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2"} err="failed to get container status \"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2\": rpc error: code = NotFound desc = could not find container \"88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2\": container with ID starting with 
88f2ca37f658e934504f037363d1dcf50d1216a8f97c9402d4d36fbb7a8d78d2 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.405065 4936 scope.go:117] "RemoveContainer" containerID="120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.426377 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data" (OuterVolumeSpecName: "config-data") pod "e79baab4-ef50-47f8-9533-33d2bdf54fbe" (UID: "e79baab4-ef50-47f8-9533-33d2bdf54fbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.432771 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.432807 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.432815 4936 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e79baab4-ef50-47f8-9533-33d2bdf54fbe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.432827 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.432839 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath 
\"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.451356 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.522816 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" (UID: "5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.536564 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.536596 4936 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.579591 4936 scope.go:117] "RemoveContainer" containerID="c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.632101 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.649958 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.658352 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.678580 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.687894 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.688299 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688310 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.688329 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688358 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.688383 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688390 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.688408 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688413 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.688427 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9100f851-91ef-4d1d-8346-047e97aef7c2" containerName="mariadb-database-create" Sep 30 14:34:31 crc 
kubenswrapper[4936]: I0930 14:34:31.688442 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9100f851-91ef-4d1d-8346-047e97aef7c2" containerName="mariadb-database-create" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688603 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688617 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9100f851-91ef-4d1d-8346-047e97aef7c2" containerName="mariadb-database-create" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688630 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688643 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" containerName="glance-log" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.688651 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" containerName="glance-httpd" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.689683 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.696926 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.697159 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sxmps" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.697254 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.702403 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.703218 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.710321 4936 scope.go:117] "RemoveContainer" containerID="120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.711515 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.718604 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671\": container with ID starting with 120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671 not found: ID does not exist" containerID="120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.718652 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671"} err="failed to get container status 
\"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671\": rpc error: code = NotFound desc = could not find container \"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671\": container with ID starting with 120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.718692 4936 scope.go:117] "RemoveContainer" containerID="c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.720366 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.722541 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:31 crc kubenswrapper[4936]: E0930 14:34:31.724656 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513\": container with ID starting with c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513 not found: ID does not exist" containerID="c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.724708 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513"} err="failed to get container status \"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513\": rpc error: code = NotFound desc = could not find container \"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513\": container with ID starting with c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.724741 4936 scope.go:117] 
"RemoveContainer" containerID="120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.724958 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.725145 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.740640 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-ceph\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.740725 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-logs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.740789 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.740810 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " 
pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.740953 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wms\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-kube-api-access-f9wms\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.741024 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-config-data\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.741093 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.741122 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-scripts\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.741152 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.749216 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671"} err="failed to get container status \"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671\": rpc error: code = NotFound desc = could not find container \"120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671\": container with ID starting with 120f648bdeb6c34baa87b7e01c3f9a840637f93980baa9f2bcc1f43e83b8c671 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.749253 4936 scope.go:117] "RemoveContainer" containerID="c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.749720 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513"} err="failed to get container status \"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513\": rpc error: code = NotFound desc = could not find container \"c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513\": container with ID starting with c47e57d09fb87645287108681368a3bfea2792443e4e6a906e8e05dab8586513 not found: ID does not exist" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846798 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-config-data\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846842 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846897 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846922 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-scripts\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846947 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.846974 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847006 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847045 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-ceph\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847102 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847145 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-logs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847179 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847206 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847232 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847252 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847279 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847305 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847509 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fmdj\" (UniqueName: 
\"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-kube-api-access-5fmdj\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847576 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.847628 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wms\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-kube-api-access-f9wms\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.848347 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.850389 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9446dfe-8843-48eb-8514-bccc85f0727e-logs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.857491 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-config-data\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.865149 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-ceph\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.865775 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.866479 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-scripts\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.867411 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wms\" (UniqueName: \"kubernetes.io/projected/e9446dfe-8843-48eb-8514-bccc85f0727e-kube-api-access-f9wms\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.872545 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.882315 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9446dfe-8843-48eb-8514-bccc85f0727e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.897415 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e9446dfe-8843-48eb-8514-bccc85f0727e\") " pod="openstack/glance-default-external-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949103 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949181 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949204 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949235 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949285 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fmdj\" (UniqueName: \"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-kube-api-access-5fmdj\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949327 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949395 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949438 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.949473 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.951213 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.953999 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.954656 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f555e7b-a6ae-4ab9-b5f1-d89581768669-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.954668 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.957087 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.962536 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.967202 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.970036 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f555e7b-a6ae-4ab9-b5f1-d89581768669-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:31 crc kubenswrapper[4936]: I0930 14:34:31.974569 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fmdj\" (UniqueName: \"kubernetes.io/projected/3f555e7b-a6ae-4ab9-b5f1-d89581768669-kube-api-access-5fmdj\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.016033 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f555e7b-a6ae-4ab9-b5f1-d89581768669\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.039844 4936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.078457 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.182168 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.186145 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.409513 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9" path="/var/lib/kubelet/pods/5b7b9c5a-0479-4447-9a6e-bf4ed8020aa9/volumes" Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.425600 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e79baab4-ef50-47f8-9533-33d2bdf54fbe" path="/var/lib/kubelet/pods/e79baab4-ef50-47f8-9533-33d2bdf54fbe/volumes" Sep 30 14:34:32 crc kubenswrapper[4936]: W0930 14:34:32.944782 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9446dfe_8843_48eb_8514_bccc85f0727e.slice/crio-481e4e95ee89290122e1fc0d143797874789e97d3c9d0d7fb08fadf1f806ef55 WatchSource:0}: Error finding container 481e4e95ee89290122e1fc0d143797874789e97d3c9d0d7fb08fadf1f806ef55: Status 404 returned error can't find the container with id 481e4e95ee89290122e1fc0d143797874789e97d3c9d0d7fb08fadf1f806ef55 Sep 30 14:34:32 crc kubenswrapper[4936]: I0930 14:34:32.953301 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:34:33 crc kubenswrapper[4936]: I0930 14:34:33.110539 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Sep 30 14:34:33 crc kubenswrapper[4936]: I0930 14:34:33.473551 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f555e7b-a6ae-4ab9-b5f1-d89581768669","Type":"ContainerStarted","Data":"61a73ca3214f9d84d05493eeeaf2eaf4823cd2a941fa8e6ddd298cd7e4ceeda7"} Sep 30 14:34:33 crc kubenswrapper[4936]: I0930 14:34:33.476734 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e9446dfe-8843-48eb-8514-bccc85f0727e","Type":"ContainerStarted","Data":"481e4e95ee89290122e1fc0d143797874789e97d3c9d0d7fb08fadf1f806ef55"} Sep 30 14:34:34 crc kubenswrapper[4936]: I0930 14:34:34.527506 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f555e7b-a6ae-4ab9-b5f1-d89581768669","Type":"ContainerStarted","Data":"239bc02478caa8aaf3fb50cb161f32241a58d994c1d3a4db09c286a0f660c272"} Sep 30 14:34:34 crc kubenswrapper[4936]: I0930 14:34:34.533693 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e9446dfe-8843-48eb-8514-bccc85f0727e","Type":"ContainerStarted","Data":"135d893e8885ef6e93150f3c5f2302ff25fbf1fb5754f0a58eae0c31c511696e"} Sep 30 14:34:35 crc kubenswrapper[4936]: I0930 14:34:35.555491 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f555e7b-a6ae-4ab9-b5f1-d89581768669","Type":"ContainerStarted","Data":"3a619dcf978e2de3f01b1bf88cb5ecd17ad2546df1890eaba6933015d6e62de7"} Sep 30 14:34:35 crc kubenswrapper[4936]: I0930 14:34:35.564980 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e9446dfe-8843-48eb-8514-bccc85f0727e","Type":"ContainerStarted","Data":"306a581bfd161e5858a0fc67f224cefe321f914830c516814b5c5a06fdd7bdd9"} Sep 30 14:34:35 crc kubenswrapper[4936]: I0930 
14:34:35.595825 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.595805135 podStartE2EDuration="4.595805135s" podCreationTimestamp="2025-09-30 14:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:34:35.594392626 +0000 UTC m=+3325.978394957" watchObservedRunningTime="2025-09-30 14:34:35.595805135 +0000 UTC m=+3325.979807436" Sep 30 14:34:41 crc kubenswrapper[4936]: I0930 14:34:41.630474 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerStarted","Data":"6536261db613157b12a0dab0a3ceef43223ace88522f6002685a11589efe7d61"} Sep 30 14:34:41 crc kubenswrapper[4936]: I0930 14:34:41.632800 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerStarted","Data":"76ed92d790453891ddab98946c7ab4f83ce726c2abd2382c0dfabe49ccbf0a38"} Sep 30 14:34:41 crc kubenswrapper[4936]: I0930 14:34:41.633896 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerStarted","Data":"806c9e049c2e75f59ca8cce527d23de3a71feff91f0cdfcbbab2b23fa1adeb5f"} Sep 30 14:34:41 crc kubenswrapper[4936]: I0930 14:34:41.634985 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerStarted","Data":"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a"} Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.041600 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 
14:34:42.041667 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.079774 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.080164 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.090046 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.091910 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.119307 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.119282417 podStartE2EDuration="11.119282417s" podCreationTimestamp="2025-09-30 14:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:34:35.670728414 +0000 UTC m=+3326.054730715" watchObservedRunningTime="2025-09-30 14:34:42.119282417 +0000 UTC m=+3332.503284718" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.125127 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.144864 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.647698 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" 
event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerStarted","Data":"4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2"} Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.650429 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerStarted","Data":"e9ed93be242f7bcef6328891298d921864531c9ecf8b9fcbbcb667c21b35460b"} Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.650519 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76ccbbb9dc-7twmk" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon" containerID="cri-o://e9ed93be242f7bcef6328891298d921864531c9ecf8b9fcbbcb667c21b35460b" gracePeriod=30 Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.650495 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76ccbbb9dc-7twmk" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon-log" containerID="cri-o://76ed92d790453891ddab98946c7ab4f83ce726c2abd2382c0dfabe49ccbf0a38" gracePeriod=30 Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.652883 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerStarted","Data":"afa56e03b5f69f83a340420ae242cad8453c2abdf1af58861ad46719f0534486"} Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.653041 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79464b7c69-cqz8d" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon-log" containerID="cri-o://806c9e049c2e75f59ca8cce527d23de3a71feff91f0cdfcbbab2b23fa1adeb5f" gracePeriod=30 Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.653048 4936 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-79464b7c69-cqz8d" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon" containerID="cri-o://afa56e03b5f69f83a340420ae242cad8453c2abdf1af58861ad46719f0534486" gracePeriod=30 Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.665497 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerStarted","Data":"16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362"} Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.665561 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.665575 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.665981 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.666328 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.682642 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b866fc884-w2td6" podStartSLOduration=3.769882538 podStartE2EDuration="16.682626725s" podCreationTimestamp="2025-09-30 14:34:26 +0000 UTC" firstStartedPulling="2025-09-30 14:34:28.188766067 +0000 UTC m=+3318.572768368" lastFinishedPulling="2025-09-30 14:34:41.101510254 +0000 UTC m=+3331.485512555" observedRunningTime="2025-09-30 14:34:42.680566168 +0000 UTC m=+3333.064568469" watchObservedRunningTime="2025-09-30 14:34:42.682626725 +0000 UTC m=+3333.066629016" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.714896 4936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-3a0a-account-create-ldfjs"] Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.716191 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.717160 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76ccbbb9dc-7twmk" podStartSLOduration=3.953119423 podStartE2EDuration="20.717137703s" podCreationTimestamp="2025-09-30 14:34:22 +0000 UTC" firstStartedPulling="2025-09-30 14:34:24.295807588 +0000 UTC m=+3314.679809889" lastFinishedPulling="2025-09-30 14:34:41.059825868 +0000 UTC m=+3331.443828169" observedRunningTime="2025-09-30 14:34:42.702739538 +0000 UTC m=+3333.086741859" watchObservedRunningTime="2025-09-30 14:34:42.717137703 +0000 UTC m=+3333.101140004" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.739756 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.740966 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3a0a-account-create-ldfjs"] Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.750764 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79464b7c69-cqz8d" podStartSLOduration=3.882188635 podStartE2EDuration="20.750747737s" podCreationTimestamp="2025-09-30 14:34:22 +0000 UTC" firstStartedPulling="2025-09-30 14:34:24.269831535 +0000 UTC m=+3314.653833836" lastFinishedPulling="2025-09-30 14:34:41.138390637 +0000 UTC m=+3331.522392938" observedRunningTime="2025-09-30 14:34:42.746602973 +0000 UTC m=+3333.130605284" watchObservedRunningTime="2025-09-30 14:34:42.750747737 +0000 UTC m=+3333.134750038" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.797769 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c45676df6-k4rk6" 
podStartSLOduration=3.646202931 podStartE2EDuration="16.797746578s" podCreationTimestamp="2025-09-30 14:34:26 +0000 UTC" firstStartedPulling="2025-09-30 14:34:27.931392036 +0000 UTC m=+3318.315394337" lastFinishedPulling="2025-09-30 14:34:41.082935683 +0000 UTC m=+3331.466937984" observedRunningTime="2025-09-30 14:34:42.785700017 +0000 UTC m=+3333.169702318" watchObservedRunningTime="2025-09-30 14:34:42.797746578 +0000 UTC m=+3333.181748889" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.836079 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjwn\" (UniqueName: \"kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn\") pod \"manila-3a0a-account-create-ldfjs\" (UID: \"bc1a270c-bc12-4d84-9482-b51a8db4be0d\") " pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.937557 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjwn\" (UniqueName: \"kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn\") pod \"manila-3a0a-account-create-ldfjs\" (UID: \"bc1a270c-bc12-4d84-9482-b51a8db4be0d\") " pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:42 crc kubenswrapper[4936]: I0930 14:34:42.971870 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjwn\" (UniqueName: \"kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn\") pod \"manila-3a0a-account-create-ldfjs\" (UID: \"bc1a270c-bc12-4d84-9482-b51a8db4be0d\") " pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:43 crc kubenswrapper[4936]: I0930 14:34:43.036041 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:43 crc kubenswrapper[4936]: I0930 14:34:43.044735 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:34:43 crc kubenswrapper[4936]: I0930 14:34:43.427515 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:34:43 crc kubenswrapper[4936]: W0930 14:34:43.630404 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc1a270c_bc12_4d84_9482_b51a8db4be0d.slice/crio-23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb WatchSource:0}: Error finding container 23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb: Status 404 returned error can't find the container with id 23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb Sep 30 14:34:43 crc kubenswrapper[4936]: I0930 14:34:43.635730 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3a0a-account-create-ldfjs"] Sep 30 14:34:43 crc kubenswrapper[4936]: I0930 14:34:43.678284 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3a0a-account-create-ldfjs" event={"ID":"bc1a270c-bc12-4d84-9482-b51a8db4be0d","Type":"ContainerStarted","Data":"23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb"} Sep 30 14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689027 4936 generic.go:334] "Generic (PLEG): container finished" podID="bc1a270c-bc12-4d84-9482-b51a8db4be0d" containerID="461fa2fc5e8c4739240d09d131792e4d0f74a72a1c6d56eb302702d6d8dd4214" exitCode=0 Sep 30 14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689460 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689562 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 
14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689627 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689649 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:34:44 crc kubenswrapper[4936]: I0930 14:34:44.689588 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3a0a-account-create-ldfjs" event={"ID":"bc1a270c-bc12-4d84-9482-b51a8db4be0d","Type":"ContainerDied","Data":"461fa2fc5e8c4739240d09d131792e4d0f74a72a1c6d56eb302702d6d8dd4214"} Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.183520 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.311498 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxjwn\" (UniqueName: \"kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn\") pod \"bc1a270c-bc12-4d84-9482-b51a8db4be0d\" (UID: \"bc1a270c-bc12-4d84-9482-b51a8db4be0d\") " Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.338262 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn" (OuterVolumeSpecName: "kube-api-access-dxjwn") pod "bc1a270c-bc12-4d84-9482-b51a8db4be0d" (UID: "bc1a270c-bc12-4d84-9482-b51a8db4be0d"). InnerVolumeSpecName "kube-api-access-dxjwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.414180 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxjwn\" (UniqueName: \"kubernetes.io/projected/bc1a270c-bc12-4d84-9482-b51a8db4be0d-kube-api-access-dxjwn\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.707293 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3a0a-account-create-ldfjs" event={"ID":"bc1a270c-bc12-4d84-9482-b51a8db4be0d","Type":"ContainerDied","Data":"23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb"} Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.707353 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23149afb0fa4d263a8bb2783588b4fdb107fbc6a73d5bbcdf623418f5e7758cb" Sep 30 14:34:46 crc kubenswrapper[4936]: I0930 14:34:46.707416 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3a0a-account-create-ldfjs" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.041425 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.041467 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.260228 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.260640 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.261261 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.292964 4936 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.293062 4936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.296289 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.479055 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.479705 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.990018 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-2cqnc"] Sep 30 14:34:47 crc kubenswrapper[4936]: E0930 14:34:47.990591 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1a270c-bc12-4d84-9482-b51a8db4be0d" containerName="mariadb-account-create" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.990620 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1a270c-bc12-4d84-9482-b51a8db4be0d" containerName="mariadb-account-create" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.990833 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1a270c-bc12-4d84-9482-b51a8db4be0d" containerName="mariadb-account-create" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.991698 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.996286 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Sep 30 14:34:47 crc kubenswrapper[4936]: I0930 14:34:47.996681 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kxd95" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.006588 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-2cqnc"] Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.070796 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.071425 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.071468 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdpvf\" (UniqueName: \"kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.071518 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.173467 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.173543 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdpvf\" (UniqueName: \"kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.173592 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.173701 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.179166 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle\") pod \"manila-db-sync-2cqnc\" (UID: 
\"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.193051 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.198794 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdpvf\" (UniqueName: \"kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.198986 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data\") pod \"manila-db-sync-2cqnc\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.251945 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.252266 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:34:48 crc kubenswrapper[4936]: I0930 14:34:48.322370 4936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-2cqnc" Sep 30 14:34:49 crc kubenswrapper[4936]: I0930 14:34:49.286771 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-2cqnc"] Sep 30 14:34:49 crc kubenswrapper[4936]: I0930 14:34:49.750953 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2cqnc" event={"ID":"3dc73413-5641-4845-a637-2430806cfa00","Type":"ContainerStarted","Data":"1fbb76d6162387d02edc16186003f169f92b81347c551064a7d47732653b7e0e"} Sep 30 14:34:56 crc kubenswrapper[4936]: I0930 14:34:56.843875 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2cqnc" event={"ID":"3dc73413-5641-4845-a637-2430806cfa00","Type":"ContainerStarted","Data":"7ebc22322b653d1bf6e19c441532fa1d415aac44347dba3603b526279c501a8c"} Sep 30 14:34:56 crc kubenswrapper[4936]: I0930 14:34:56.888753 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-2cqnc" podStartSLOduration=3.401440558 podStartE2EDuration="9.888732726s" podCreationTimestamp="2025-09-30 14:34:47 +0000 UTC" firstStartedPulling="2025-09-30 14:34:49.296467299 +0000 UTC m=+3339.680469600" lastFinishedPulling="2025-09-30 14:34:55.783759467 +0000 UTC m=+3346.167761768" observedRunningTime="2025-09-30 14:34:56.875109752 +0000 UTC m=+3347.259112053" watchObservedRunningTime="2025-09-30 14:34:56.888732726 +0000 UTC m=+3347.272735027" Sep 30 14:34:57 crc kubenswrapper[4936]: I0930 14:34:57.042878 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:34:57 crc kubenswrapper[4936]: I0930 14:34:57.482664 4936 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:35:07 crc kubenswrapper[4936]: I0930 14:35:07.042616 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:35:07 crc kubenswrapper[4936]: I0930 14:35:07.479865 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:35:08 crc kubenswrapper[4936]: I0930 14:35:08.951934 4936 generic.go:334] "Generic (PLEG): container finished" podID="3dc73413-5641-4845-a637-2430806cfa00" containerID="7ebc22322b653d1bf6e19c441532fa1d415aac44347dba3603b526279c501a8c" exitCode=0 Sep 30 14:35:08 crc kubenswrapper[4936]: I0930 14:35:08.952423 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2cqnc" event={"ID":"3dc73413-5641-4845-a637-2430806cfa00","Type":"ContainerDied","Data":"7ebc22322b653d1bf6e19c441532fa1d415aac44347dba3603b526279c501a8c"} Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.461867 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-2cqnc" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.575515 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle\") pod \"3dc73413-5641-4845-a637-2430806cfa00\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.575754 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data\") pod \"3dc73413-5641-4845-a637-2430806cfa00\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.575786 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data\") pod \"3dc73413-5641-4845-a637-2430806cfa00\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.575823 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdpvf\" (UniqueName: \"kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf\") pod \"3dc73413-5641-4845-a637-2430806cfa00\" (UID: \"3dc73413-5641-4845-a637-2430806cfa00\") " Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.589698 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "3dc73413-5641-4845-a637-2430806cfa00" (UID: "3dc73413-5641-4845-a637-2430806cfa00"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.589877 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf" (OuterVolumeSpecName: "kube-api-access-qdpvf") pod "3dc73413-5641-4845-a637-2430806cfa00" (UID: "3dc73413-5641-4845-a637-2430806cfa00"). InnerVolumeSpecName "kube-api-access-qdpvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.593673 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data" (OuterVolumeSpecName: "config-data") pod "3dc73413-5641-4845-a637-2430806cfa00" (UID: "3dc73413-5641-4845-a637-2430806cfa00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.613183 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dc73413-5641-4845-a637-2430806cfa00" (UID: "3dc73413-5641-4845-a637-2430806cfa00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.678521 4936 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-job-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.678567 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.678581 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdpvf\" (UniqueName: \"kubernetes.io/projected/3dc73413-5641-4845-a637-2430806cfa00-kube-api-access-qdpvf\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.678596 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc73413-5641-4845-a637-2430806cfa00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.971602 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2cqnc" event={"ID":"3dc73413-5641-4845-a637-2430806cfa00","Type":"ContainerDied","Data":"1fbb76d6162387d02edc16186003f169f92b81347c551064a7d47732653b7e0e"} Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.971635 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-2cqnc" Sep 30 14:35:10 crc kubenswrapper[4936]: I0930 14:35:10.971643 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fbb76d6162387d02edc16186003f169f92b81347c551064a7d47732653b7e0e" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.246791 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: E0930 14:35:11.247221 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc73413-5641-4845-a637-2430806cfa00" containerName="manila-db-sync" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.247238 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc73413-5641-4845-a637-2430806cfa00" containerName="manila-db-sync" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.247434 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc73413-5641-4845-a637-2430806cfa00" containerName="manila-db-sync" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.248451 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.252794 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kxd95" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.255322 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.255534 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.255659 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.281094 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.283059 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.285847 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.301416 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.310712 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.389958 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.389997 
4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390026 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390161 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390193 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390206 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390233 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390249 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390266 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390295 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvv6\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.390468 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.393023 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64j2d\" (UniqueName: \"kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.393407 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.453626 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77766fdf55-rzbvr"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.455303 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.466242 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77766fdf55-rzbvr"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496055 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496108 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496134 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496167 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496373 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496405 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496452 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496483 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496501 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496550 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496584 
4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvv6\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496715 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496793 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.496827 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64j2d\" (UniqueName: \"kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.501936 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.513713 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.514045 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.517893 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.525747 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.527373 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64j2d\" (UniqueName: \"kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.535038 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " 
pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.536114 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.539900 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts\") pod \"manila-scheduler-0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.540032 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.540060 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.554495 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.559145 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvv6\" 
(UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.604171 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2wdw\" (UniqueName: \"kubernetes.io/projected/e1481c32-550d-4129-856d-8bc79389c0d3-kube-api-access-f2wdw\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.604616 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.605254 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.609452 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-config\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.609881 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-dns-svc\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.617537 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.616134 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph\") pod \"manila-share-share1-0\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.604911 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.666755 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.671469 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.682042 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.720559 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2wdw\" (UniqueName: \"kubernetes.io/projected/e1481c32-550d-4129-856d-8bc79389c0d3-kube-api-access-f2wdw\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.721000 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.721117 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.721251 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-config\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.722011 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.728448 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.737758 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-dns-svc\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.738068 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.739849 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.740472 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-config\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.740656 4936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.742035 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1481c32-550d-4129-856d-8bc79389c0d3-dns-svc\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.775787 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2wdw\" (UniqueName: \"kubernetes.io/projected/e1481c32-550d-4129-856d-8bc79389c0d3-kube-api-access-f2wdw\") pod \"dnsmasq-dns-77766fdf55-rzbvr\" (UID: \"e1481c32-550d-4129-856d-8bc79389c0d3\") " pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.802984 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.844693 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krptm\" (UniqueName: \"kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845137 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845209 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845228 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845276 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845298 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.845314 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.870833 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949674 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949788 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949818 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949866 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949885 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949906 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.949969 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krptm\" (UniqueName: \"kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.950419 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.952406 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " 
pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.957649 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.958551 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.960389 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.963906 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:11 crc kubenswrapper[4936]: I0930 14:35:11.978035 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krptm\" (UniqueName: \"kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm\") pod \"manila-api-0\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " pod="openstack/manila-api-0" Sep 30 14:35:12 crc kubenswrapper[4936]: I0930 14:35:12.214957 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:12 crc kubenswrapper[4936]: I0930 14:35:12.570999 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:12 crc kubenswrapper[4936]: I0930 14:35:12.785080 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77766fdf55-rzbvr"] Sep 30 14:35:12 crc kubenswrapper[4936]: W0930 14:35:12.820839 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1481c32_550d_4129_856d_8bc79389c0d3.slice/crio-16a75c8d0f0811221b47e2e814fddb9040ed016ca36560d7129e17f199f36cfe WatchSource:0}: Error finding container 16a75c8d0f0811221b47e2e814fddb9040ed016ca36560d7129e17f199f36cfe: Status 404 returned error can't find the container with id 16a75c8d0f0811221b47e2e814fddb9040ed016ca36560d7129e17f199f36cfe Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.037735 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.047381 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerStarted","Data":"fc656b4d7a546b7b95831f52ec8bc67aac22a73d191717e0894d1f6a08c16ee5"} Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.048826 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" event={"ID":"e1481c32-550d-4129-856d-8bc79389c0d3","Type":"ContainerStarted","Data":"16a75c8d0f0811221b47e2e814fddb9040ed016ca36560d7129e17f199f36cfe"} Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.056661 4936 generic.go:334] "Generic (PLEG): container finished" podID="357dcb07-796d-420d-b883-5304f74c724c" containerID="e9ed93be242f7bcef6328891298d921864531c9ecf8b9fcbbcb667c21b35460b" exitCode=137 Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 
14:35:13.056703 4936 generic.go:334] "Generic (PLEG): container finished" podID="357dcb07-796d-420d-b883-5304f74c724c" containerID="76ed92d790453891ddab98946c7ab4f83ce726c2abd2382c0dfabe49ccbf0a38" exitCode=137 Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.056753 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerDied","Data":"e9ed93be242f7bcef6328891298d921864531c9ecf8b9fcbbcb667c21b35460b"} Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.056785 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerDied","Data":"76ed92d790453891ddab98946c7ab4f83ce726c2abd2382c0dfabe49ccbf0a38"} Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.063427 4936 generic.go:334] "Generic (PLEG): container finished" podID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerID="afa56e03b5f69f83a340420ae242cad8453c2abdf1af58861ad46719f0534486" exitCode=137 Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.063463 4936 generic.go:334] "Generic (PLEG): container finished" podID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerID="806c9e049c2e75f59ca8cce527d23de3a71feff91f0cdfcbbab2b23fa1adeb5f" exitCode=137 Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.063482 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerDied","Data":"afa56e03b5f69f83a340420ae242cad8453c2abdf1af58861ad46719f0534486"} Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.063542 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerDied","Data":"806c9e049c2e75f59ca8cce527d23de3a71feff91f0cdfcbbab2b23fa1adeb5f"} Sep 30 14:35:13 crc kubenswrapper[4936]: 
W0930 14:35:13.159302 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb49504_7154_45ef_bd12_9ce1d3d1cf33.slice/crio-d13a1bb09fcc6452b3b25042f4cbccb0971a4a298444f97a2a1108e6274bc08b WatchSource:0}: Error finding container d13a1bb09fcc6452b3b25042f4cbccb0971a4a298444f97a2a1108e6274bc08b: Status 404 returned error can't find the container with id d13a1bb09fcc6452b3b25042f4cbccb0971a4a298444f97a2a1108e6274bc08b Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.274587 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.824088 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.832657 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897142 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key\") pod \"357dcb07-796d-420d-b883-5304f74c724c\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897207 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b275z\" (UniqueName: \"kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z\") pod \"357dcb07-796d-420d-b883-5304f74c724c\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897425 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data\") 
pod \"357dcb07-796d-420d-b883-5304f74c724c\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897462 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data\") pod \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897589 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts\") pod \"357dcb07-796d-420d-b883-5304f74c724c\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897667 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key\") pod \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897697 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts\") pod \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897774 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs\") pod \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897834 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs\") pod \"357dcb07-796d-420d-b883-5304f74c724c\" (UID: \"357dcb07-796d-420d-b883-5304f74c724c\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.897902 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756tm\" (UniqueName: \"kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm\") pod \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\" (UID: \"fcc04d7d-2a45-41e7-8c63-883df94fcd08\") " Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.906614 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z" (OuterVolumeSpecName: "kube-api-access-b275z") pod "357dcb07-796d-420d-b883-5304f74c724c" (UID: "357dcb07-796d-420d-b883-5304f74c724c"). InnerVolumeSpecName "kube-api-access-b275z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.907297 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs" (OuterVolumeSpecName: "logs") pod "357dcb07-796d-420d-b883-5304f74c724c" (UID: "357dcb07-796d-420d-b883-5304f74c724c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.907401 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs" (OuterVolumeSpecName: "logs") pod "fcc04d7d-2a45-41e7-8c63-883df94fcd08" (UID: "fcc04d7d-2a45-41e7-8c63-883df94fcd08"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.914212 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "357dcb07-796d-420d-b883-5304f74c724c" (UID: "357dcb07-796d-420d-b883-5304f74c724c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.918753 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fcc04d7d-2a45-41e7-8c63-883df94fcd08" (UID: "fcc04d7d-2a45-41e7-8c63-883df94fcd08"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:13 crc kubenswrapper[4936]: I0930 14:35:13.955082 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm" (OuterVolumeSpecName: "kube-api-access-756tm") pod "fcc04d7d-2a45-41e7-8c63-883df94fcd08" (UID: "fcc04d7d-2a45-41e7-8c63-883df94fcd08"). InnerVolumeSpecName "kube-api-access-756tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015473 4936 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc04d7d-2a45-41e7-8c63-883df94fcd08-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015512 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc04d7d-2a45-41e7-8c63-883df94fcd08-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015527 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/357dcb07-796d-420d-b883-5304f74c724c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015536 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756tm\" (UniqueName: \"kubernetes.io/projected/fcc04d7d-2a45-41e7-8c63-883df94fcd08-kube-api-access-756tm\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015551 4936 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/357dcb07-796d-420d-b883-5304f74c724c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.015565 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b275z\" (UniqueName: \"kubernetes.io/projected/357dcb07-796d-420d-b883-5304f74c724c-kube-api-access-b275z\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.045030 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts" (OuterVolumeSpecName: "scripts") pod "357dcb07-796d-420d-b883-5304f74c724c" (UID: "357dcb07-796d-420d-b883-5304f74c724c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.101672 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data" (OuterVolumeSpecName: "config-data") pod "357dcb07-796d-420d-b883-5304f74c724c" (UID: "357dcb07-796d-420d-b883-5304f74c724c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.102055 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data" (OuterVolumeSpecName: "config-data") pod "fcc04d7d-2a45-41e7-8c63-883df94fcd08" (UID: "fcc04d7d-2a45-41e7-8c63-883df94fcd08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.114267 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79464b7c69-cqz8d" event={"ID":"fcc04d7d-2a45-41e7-8c63-883df94fcd08","Type":"ContainerDied","Data":"b4dcd2925ae384002521e9668ed98cda0217a4dd5903d40d4af8918f85073c1c"} Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.114349 4936 scope.go:117] "RemoveContainer" containerID="afa56e03b5f69f83a340420ae242cad8453c2abdf1af58861ad46719f0534486" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.114442 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79464b7c69-cqz8d" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.119416 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.119439 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.119448 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/357dcb07-796d-420d-b883-5304f74c724c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.120431 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerStarted","Data":"d13a1bb09fcc6452b3b25042f4cbccb0971a4a298444f97a2a1108e6274bc08b"} Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.130112 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts" (OuterVolumeSpecName: "scripts") pod "fcc04d7d-2a45-41e7-8c63-883df94fcd08" (UID: "fcc04d7d-2a45-41e7-8c63-883df94fcd08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.131122 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerStarted","Data":"fd592dcd241ff27b502638ba4fab2a95eb0cf279fded3c229de5cef53bb02596"} Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.141913 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76ccbbb9dc-7twmk" event={"ID":"357dcb07-796d-420d-b883-5304f74c724c","Type":"ContainerDied","Data":"424549dc6d1659968c92fc5bf7d47d7d2b896c7fef88e89b5216701ed82ba63a"} Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.142049 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76ccbbb9dc-7twmk" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.221663 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc04d7d-2a45-41e7-8c63-883df94fcd08-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.256589 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.275251 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76ccbbb9dc-7twmk"] Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.336100 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357dcb07-796d-420d-b883-5304f74c724c" path="/var/lib/kubelet/pods/357dcb07-796d-420d-b883-5304f74c724c/volumes" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.500328 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.517266 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79464b7c69-cqz8d"] Sep 30 
14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.612901 4936 scope.go:117] "RemoveContainer" containerID="806c9e049c2e75f59ca8cce527d23de3a71feff91f0cdfcbbab2b23fa1adeb5f" Sep 30 14:35:14 crc kubenswrapper[4936]: I0930 14:35:14.963994 4936 scope.go:117] "RemoveContainer" containerID="e9ed93be242f7bcef6328891298d921864531c9ecf8b9fcbbcb667c21b35460b" Sep 30 14:35:15 crc kubenswrapper[4936]: I0930 14:35:15.160746 4936 generic.go:334] "Generic (PLEG): container finished" podID="e1481c32-550d-4129-856d-8bc79389c0d3" containerID="ca2cc11b99c03f44c03bcfe809dda72b098a69d6fae4eda95116bc8b14b22821" exitCode=0 Sep 30 14:35:15 crc kubenswrapper[4936]: I0930 14:35:15.161186 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" event={"ID":"e1481c32-550d-4129-856d-8bc79389c0d3","Type":"ContainerDied","Data":"ca2cc11b99c03f44c03bcfe809dda72b098a69d6fae4eda95116bc8b14b22821"} Sep 30 14:35:15 crc kubenswrapper[4936]: I0930 14:35:15.192429 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerStarted","Data":"dd4707b63739c1ee8605f50555a7aac8746f8b89ad1eb634d1772040c33ab293"} Sep 30 14:35:15 crc kubenswrapper[4936]: I0930 14:35:15.211275 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerStarted","Data":"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212"} Sep 30 14:35:15 crc kubenswrapper[4936]: I0930 14:35:15.449200 4936 scope.go:117] "RemoveContainer" containerID="76ed92d790453891ddab98946c7ab4f83ce726c2abd2382c0dfabe49ccbf0a38" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.224557 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerStarted","Data":"6565d8446c76e22d8ff127d651af2f997a103c104891862cda2329f3f3672eda"} Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.232961 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerStarted","Data":"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db"} Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.233074 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.236565 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" event={"ID":"e1481c32-550d-4129-856d-8bc79389c0d3","Type":"ContainerStarted","Data":"444a697c110d8ec9cc5c59c6b993f28ef997e5fc289fafcd47f0b235871ba750"} Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.236827 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.249702 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.367040686 podStartE2EDuration="5.249677466s" podCreationTimestamp="2025-09-30 14:35:11 +0000 UTC" firstStartedPulling="2025-09-30 14:35:12.582284855 +0000 UTC m=+3362.966287156" lastFinishedPulling="2025-09-30 14:35:13.464921635 +0000 UTC m=+3363.848923936" observedRunningTime="2025-09-30 14:35:16.245203113 +0000 UTC m=+3366.629205414" watchObservedRunningTime="2025-09-30 14:35:16.249677466 +0000 UTC m=+3366.633679787" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.277343 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" podStartSLOduration=5.277319605 podStartE2EDuration="5.277319605s" 
podCreationTimestamp="2025-09-30 14:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:35:16.269557472 +0000 UTC m=+3366.653559783" watchObservedRunningTime="2025-09-30 14:35:16.277319605 +0000 UTC m=+3366.661321906" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.291328 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.291295279 podStartE2EDuration="5.291295279s" podCreationTimestamp="2025-09-30 14:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:35:16.287973388 +0000 UTC m=+3366.671975709" watchObservedRunningTime="2025-09-30 14:35:16.291295279 +0000 UTC m=+3366.675297590" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.330744 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" path="/var/lib/kubelet/pods/fcc04d7d-2a45-41e7-8c63-883df94fcd08/volumes" Sep 30 14:35:16 crc kubenswrapper[4936]: I0930 14:35:16.528953 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:18 crc kubenswrapper[4936]: I0930 14:35:18.251164 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:35:18 crc kubenswrapper[4936]: I0930 14:35:18.252458 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Sep 30 14:35:18 crc kubenswrapper[4936]: I0930 14:35:18.267593 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api" containerID="cri-o://7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" gracePeriod=30 Sep 30 14:35:18 crc kubenswrapper[4936]: I0930 14:35:18.267806 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api-log" containerID="cri-o://0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" gracePeriod=30 Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.240048 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282079 4936 generic.go:334] "Generic (PLEG): container finished" podID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerID="7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" exitCode=0 Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282127 4936 generic.go:334] "Generic (PLEG): container finished" podID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerID="0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" exitCode=143 Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282158 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerDied","Data":"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db"} Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282208 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerDied","Data":"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212"} 
Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282221 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2682bfb-8a96-44ea-8e7d-707de889dd1c","Type":"ContainerDied","Data":"fd592dcd241ff27b502638ba4fab2a95eb0cf279fded3c229de5cef53bb02596"} Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282241 4936 scope.go:117] "RemoveContainer" containerID="7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.282471 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.334756 4936 scope.go:117] "RemoveContainer" containerID="0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.368956 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.369046 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krptm\" (UniqueName: \"kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.369098 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.369135 4936 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.370319 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.370459 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.370550 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data\") pod \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\" (UID: \"f2682bfb-8a96-44ea-8e7d-707de889dd1c\") " Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.373951 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.375045 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs" (OuterVolumeSpecName: "logs") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.376788 4936 scope.go:117] "RemoveContainer" containerID="7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.378467 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db\": container with ID starting with 7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db not found: ID does not exist" containerID="7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.378644 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db"} err="failed to get container status \"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db\": rpc error: code = NotFound desc = could not find container \"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db\": container with ID starting with 7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db not found: ID does not exist" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.378752 4936 scope.go:117] "RemoveContainer" containerID="0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.385393 4936 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm" (OuterVolumeSpecName: "kube-api-access-krptm") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "kube-api-access-krptm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.386219 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212\": container with ID starting with 0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212 not found: ID does not exist" containerID="0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.386256 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212"} err="failed to get container status \"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212\": rpc error: code = NotFound desc = could not find container \"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212\": container with ID starting with 0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212 not found: ID does not exist" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.386286 4936 scope.go:117] "RemoveContainer" containerID="7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.387216 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db"} err="failed to get container status \"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db\": rpc error: code = NotFound desc = could not find container 
\"7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db\": container with ID starting with 7f8185da882a522066b99a08a5f8b76700b23ff9f8e1b854841e4539d67d14db not found: ID does not exist" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.387237 4936 scope.go:117] "RemoveContainer" containerID="0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.387576 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212"} err="failed to get container status \"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212\": rpc error: code = NotFound desc = could not find container \"0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212\": container with ID starting with 0a80b500739db82c2f96c45689d1ee97158ac7ed3007c8c2bd41537065e99212 not found: ID does not exist" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.389328 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts" (OuterVolumeSpecName: "scripts") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.402628 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.429376 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475339 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475390 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2682bfb-8a96-44ea-8e7d-707de889dd1c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475407 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2682bfb-8a96-44ea-8e7d-707de889dd1c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475417 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krptm\" (UniqueName: \"kubernetes.io/projected/f2682bfb-8a96-44ea-8e7d-707de889dd1c-kube-api-access-krptm\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475431 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.475439 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.497756 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data" (OuterVolumeSpecName: "config-data") pod "f2682bfb-8a96-44ea-8e7d-707de889dd1c" (UID: "f2682bfb-8a96-44ea-8e7d-707de889dd1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.577997 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2682bfb-8a96-44ea-8e7d-707de889dd1c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.638083 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.713097 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.744323 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746068 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746111 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746154 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746163 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" 
containerName="manila-api" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746184 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746191 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746253 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746262 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746279 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746290 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api-log" Sep 30 14:35:19 crc kubenswrapper[4936]: E0930 14:35:19.746320 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746329 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746812 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746852 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon" Sep 30 14:35:19 crc 
kubenswrapper[4936]: I0930 14:35:19.746876 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc04d7d-2a45-41e7-8c63-883df94fcd08" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746905 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="357dcb07-796d-420d-b883-5304f74c724c" containerName="horizon-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746929 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api-log" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.746952 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" containerName="manila-api" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.750021 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.755273 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.755659 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.759409 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.768838 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790574 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc 
kubenswrapper[4936]: I0930 14:35:19.790624 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4389a75-ea04-4a03-97df-6063474dd74e-etc-machine-id\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790649 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-public-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790695 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-scripts\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790717 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4389a75-ea04-4a03-97df-6063474dd74e-logs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790752 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790847 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data-custom\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790868 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjrdc\" (UniqueName: \"kubernetes.io/projected/b4389a75-ea04-4a03-97df-6063474dd74e-kube-api-access-sjrdc\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.790900 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892258 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data-custom\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892533 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjrdc\" (UniqueName: \"kubernetes.io/projected/b4389a75-ea04-4a03-97df-6063474dd74e-kube-api-access-sjrdc\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892635 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-combined-ca-bundle\") pod 
\"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892736 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892820 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4389a75-ea04-4a03-97df-6063474dd74e-etc-machine-id\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.892925 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-public-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.893042 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-scripts\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.893123 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4389a75-ea04-4a03-97df-6063474dd74e-logs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.893196 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.895006 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4389a75-ea04-4a03-97df-6063474dd74e-etc-machine-id\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.903477 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4389a75-ea04-4a03-97df-6063474dd74e-logs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.906767 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-public-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.907041 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-scripts\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.909006 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.913437 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.918116 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.922869 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjrdc\" (UniqueName: \"kubernetes.io/projected/b4389a75-ea04-4a03-97df-6063474dd74e-kube-api-access-sjrdc\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:19 crc kubenswrapper[4936]: I0930 14:35:19.938244 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4389a75-ea04-4a03-97df-6063474dd74e-config-data-custom\") pod \"manila-api-0\" (UID: \"b4389a75-ea04-4a03-97df-6063474dd74e\") " pod="openstack/manila-api-0" Sep 30 14:35:20 crc kubenswrapper[4936]: I0930 14:35:20.107680 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Sep 30 14:35:20 crc kubenswrapper[4936]: I0930 14:35:20.333119 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2682bfb-8a96-44ea-8e7d-707de889dd1c" path="/var/lib/kubelet/pods/f2682bfb-8a96-44ea-8e7d-707de889dd1c/volumes" Sep 30 14:35:21 crc kubenswrapper[4936]: I0930 14:35:21.605518 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Sep 30 14:35:21 crc kubenswrapper[4936]: I0930 14:35:21.805498 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77766fdf55-rzbvr" Sep 30 14:35:21 crc kubenswrapper[4936]: I0930 14:35:21.903973 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:35:21 crc kubenswrapper[4936]: I0930 14:35:21.904371 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="dnsmasq-dns" containerID="cri-o://3c4879fc349049e199522214d8a42998ffe40ef53b5b6beacbf32c5d51253673" gracePeriod=10 Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.064600 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.064699 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.065944 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" 
containerStatusID={"Type":"cri-o","ID":"16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362"} pod="openstack/horizon-7c45676df6-k4rk6" containerMessage="Container horizon failed startup probe, will be restarted" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.065978 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" containerID="cri-o://16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362" gracePeriod=30 Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.355050 4936 generic.go:334] "Generic (PLEG): container finished" podID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerID="3c4879fc349049e199522214d8a42998ffe40ef53b5b6beacbf32c5d51253673" exitCode=0 Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.372828 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" event={"ID":"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e","Type":"ContainerDied","Data":"3c4879fc349049e199522214d8a42998ffe40ef53b5b6beacbf32c5d51253673"} Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.488533 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.488607 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.489326 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2"} 
pod="openstack/horizon-7b866fc884-w2td6" containerMessage="Container horizon failed startup probe, will be restarted" Sep 30 14:35:22 crc kubenswrapper[4936]: I0930 14:35:22.489380 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" containerID="cri-o://4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2" gracePeriod=30 Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.468516 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.471085 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.493476 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.546986 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.547295 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds88t\" (UniqueName: \"kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.547454 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.650948 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.651047 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds88t\" (UniqueName: \"kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.651102 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.651764 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.652019 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.691804 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds88t\" (UniqueName: \"kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t\") pod \"certified-operators-sqk9d\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:24 crc kubenswrapper[4936]: I0930 14:35:24.825391 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.404841 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.507550 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb\") pod \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.507638 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config\") pod \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.507701 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam\") pod 
\"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.507774 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ths7\" (UniqueName: \"kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7\") pod \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.507828 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc\") pod \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.508042 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb\") pod \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\" (UID: \"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e\") " Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.533480 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfa0e282-87b9-4509-ad57-429aa110b324" containerID="16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362" exitCode=0 Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.536044 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerDied","Data":"16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362"} Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.551559 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7" (OuterVolumeSpecName: "kube-api-access-4ths7") pod 
"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "kube-api-access-4ths7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.594129 4936 generic.go:334] "Generic (PLEG): container finished" podID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerID="4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2" exitCode=0 Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.594231 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerDied","Data":"4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2"} Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.612309 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ths7\" (UniqueName: \"kubernetes.io/projected/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-kube-api-access-4ths7\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.633666 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" event={"ID":"1f86ab2e-56dc-4b7a-a3a3-b69b0922932e","Type":"ContainerDied","Data":"de759f2acafd06490f3a8733642e87af6b60436f3d369a0ec0deefa1ac064979"} Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.633734 4936 scope.go:117] "RemoveContainer" containerID="3c4879fc349049e199522214d8a42998ffe40ef53b5b6beacbf32c5d51253673" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.633905 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb68d687f-6n4nt" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.698455 4936 scope.go:117] "RemoveContainer" containerID="3e061a13e9108cd23c7d0880477eb2b818d01b7f2d24f488329385a81bef2a7d" Sep 30 14:35:26 crc kubenswrapper[4936]: I0930 14:35:26.956516 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 14:35:26 crc kubenswrapper[4936]: W0930 14:35:26.973710 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4389a75_ea04_4a03_97df_6063474dd74e.slice/crio-8fbb5fda9e5f59f61f478689af5b2e1cc124623350cf1cfed70614c6d178af4f WatchSource:0}: Error finding container 8fbb5fda9e5f59f61f478689af5b2e1cc124623350cf1cfed70614c6d178af4f: Status 404 returned error can't find the container with id 8fbb5fda9e5f59f61f478689af5b2e1cc124623350cf1cfed70614c6d178af4f Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.012455 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.058427 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.065950 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.069658 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config" (OuterVolumeSpecName: "config") pod "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.099092 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.117214 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.124730 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" (UID: "1f86ab2e-56dc-4b7a-a3a3-b69b0922932e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.169309 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.169423 4936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.169434 4936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.169441 4936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.251723 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.252028 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-central-agent" containerID="cri-o://31155e3de1ebb1e32246611383b79d224d86f23aba7b0fcd82434a6212289502" gracePeriod=30 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.252180 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="proxy-httpd" containerID="cri-o://1dcfe9d384a17bede757c5fd2d6523a1742aa9b6df8337fa8d25aa7c8d932bab" gracePeriod=30 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 
14:35:27.252265 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="sg-core" containerID="cri-o://e22fa2826711a07486a9cff26bcd07352ea88f10436b201ddfbff8eaeb40c8ae" gracePeriod=30 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.252322 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-notification-agent" containerID="cri-o://a26cec4444b1ccba15c6d11da9da04b69f2847ae56603f995eed9feaef2049b1" gracePeriod=30 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.309451 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.342179 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb68d687f-6n4nt"] Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.672679 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerStarted","Data":"0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.678577 4936 generic.go:334] "Generic (PLEG): container finished" podID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerID="e22fa2826711a07486a9cff26bcd07352ea88f10436b201ddfbff8eaeb40c8ae" exitCode=2 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.678648 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerDied","Data":"e22fa2826711a07486a9cff26bcd07352ea88f10436b201ddfbff8eaeb40c8ae"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.687980 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerStarted","Data":"39d7b37300fc1717d0b4898373a73c768a1fe95a9a5c10542482092767aef12d"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.691057 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b4389a75-ea04-4a03-97df-6063474dd74e","Type":"ContainerStarted","Data":"8fbb5fda9e5f59f61f478689af5b2e1cc124623350cf1cfed70614c6d178af4f"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.701816 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerStarted","Data":"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.707029 4936 generic.go:334] "Generic (PLEG): container finished" podID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerID="41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a" exitCode=0 Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.707072 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerDied","Data":"41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a"} Sep 30 14:35:27 crc kubenswrapper[4936]: I0930 14:35:27.707100 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerStarted","Data":"c64cdf5788ba022c335e9e86d1b28db3a2939a219354e228690470cbeb6e2d0a"} Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.339770 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" path="/var/lib/kubelet/pods/1f86ab2e-56dc-4b7a-a3a3-b69b0922932e/volumes" Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 
14:35:28.729678 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerStarted","Data":"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065"} Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.736634 4936 generic.go:334] "Generic (PLEG): container finished" podID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerID="1dcfe9d384a17bede757c5fd2d6523a1742aa9b6df8337fa8d25aa7c8d932bab" exitCode=0 Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.737025 4936 generic.go:334] "Generic (PLEG): container finished" podID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerID="31155e3de1ebb1e32246611383b79d224d86f23aba7b0fcd82434a6212289502" exitCode=0 Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.736858 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerDied","Data":"1dcfe9d384a17bede757c5fd2d6523a1742aa9b6df8337fa8d25aa7c8d932bab"} Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.737121 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerDied","Data":"31155e3de1ebb1e32246611383b79d224d86f23aba7b0fcd82434a6212289502"} Sep 30 14:35:28 crc kubenswrapper[4936]: I0930 14:35:28.749470 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b4389a75-ea04-4a03-97df-6063474dd74e","Type":"ContainerStarted","Data":"e3217bb43a317ec7ba727a421fe928e279c3d564e9e753ef3dc27301b0761f99"} Sep 30 14:35:29 crc kubenswrapper[4936]: I0930 14:35:29.759542 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b4389a75-ea04-4a03-97df-6063474dd74e","Type":"ContainerStarted","Data":"8f6d629434a1e8f6a6c5c4fa6732bb682e2f7e07f618f7b1b023575bf160aeb6"} Sep 30 14:35:29 crc 
kubenswrapper[4936]: I0930 14:35:29.762399 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Sep 30 14:35:29 crc kubenswrapper[4936]: I0930 14:35:29.766155 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerStarted","Data":"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477"} Sep 30 14:35:29 crc kubenswrapper[4936]: I0930 14:35:29.825244 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.884687513 podStartE2EDuration="18.825216636s" podCreationTimestamp="2025-09-30 14:35:11 +0000 UTC" firstStartedPulling="2025-09-30 14:35:13.174568998 +0000 UTC m=+3363.558571299" lastFinishedPulling="2025-09-30 14:35:26.115098121 +0000 UTC m=+3376.499100422" observedRunningTime="2025-09-30 14:35:28.766250861 +0000 UTC m=+3379.150253172" watchObservedRunningTime="2025-09-30 14:35:29.825216636 +0000 UTC m=+3380.209218937" Sep 30 14:35:29 crc kubenswrapper[4936]: I0930 14:35:29.839467 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=10.839441457 podStartE2EDuration="10.839441457s" podCreationTimestamp="2025-09-30 14:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:35:29.798975005 +0000 UTC m=+3380.182977306" watchObservedRunningTime="2025-09-30 14:35:29.839441457 +0000 UTC m=+3380.223443758" Sep 30 14:35:30 crc kubenswrapper[4936]: I0930 14:35:30.778444 4936 generic.go:334] "Generic (PLEG): container finished" podID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerID="a26cec4444b1ccba15c6d11da9da04b69f2847ae56603f995eed9feaef2049b1" exitCode=0 Sep 30 14:35:30 crc kubenswrapper[4936]: I0930 14:35:30.778526 4936 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerDied","Data":"a26cec4444b1ccba15c6d11da9da04b69f2847ae56603f995eed9feaef2049b1"} Sep 30 14:35:30 crc kubenswrapper[4936]: I0930 14:35:30.783070 4936 generic.go:334] "Generic (PLEG): container finished" podID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerID="e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477" exitCode=0 Sep 30 14:35:30 crc kubenswrapper[4936]: I0930 14:35:30.783135 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerDied","Data":"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477"} Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.732496 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.793894 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerStarted","Data":"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0"} Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.798318 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.798314 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31","Type":"ContainerDied","Data":"ac4d2539fd9953e9cf3c49fde4ef7561dd6a73d3bdf94ec7216a97350eb960f4"} Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.798401 4936 scope.go:117] "RemoveContainer" containerID="1dcfe9d384a17bede757c5fd2d6523a1742aa9b6df8337fa8d25aa7c8d932bab" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.809262 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.809848 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.810510 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.810630 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 
14:35:31.810659 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.810797 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.810887 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.810928 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ql5w\" (UniqueName: \"kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w\") pod \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\" (UID: \"b3ebf958-8faf-43dc-b4f4-7ba0c6832a31\") " Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.811482 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.811726 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.823590 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sqk9d" podStartSLOduration=4.274064758 podStartE2EDuration="7.82356279s" podCreationTimestamp="2025-09-30 14:35:24 +0000 UTC" firstStartedPulling="2025-09-30 14:35:27.711223524 +0000 UTC m=+3378.095225825" lastFinishedPulling="2025-09-30 14:35:31.260721556 +0000 UTC m=+3381.644723857" observedRunningTime="2025-09-30 14:35:31.819202781 +0000 UTC m=+3382.203205082" watchObservedRunningTime="2025-09-30 14:35:31.82356279 +0000 UTC m=+3382.207565101" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.830094 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w" (OuterVolumeSpecName: "kube-api-access-5ql5w") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "kube-api-access-5ql5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.838326 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts" (OuterVolumeSpecName: "scripts") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.843581 4936 scope.go:117] "RemoveContainer" containerID="e22fa2826711a07486a9cff26bcd07352ea88f10436b201ddfbff8eaeb40c8ae" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.872238 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.877310 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.914148 4936 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.914179 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.914188 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ql5w\" (UniqueName: \"kubernetes.io/projected/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-kube-api-access-5ql5w\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.914200 4936 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:31 crc kubenswrapper[4936]: I0930 14:35:31.914209 4936 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.051003 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.060553 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data" (OuterVolumeSpecName: "config-data") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.102298 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" (UID: "b3ebf958-8faf-43dc-b4f4-7ba0c6832a31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.119849 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.119910 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.119924 4936 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.217836 4936 scope.go:117] "RemoveContainer" containerID="a26cec4444b1ccba15c6d11da9da04b69f2847ae56603f995eed9feaef2049b1" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.269789 4936 scope.go:117] "RemoveContainer" containerID="31155e3de1ebb1e32246611383b79d224d86f23aba7b0fcd82434a6212289502" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.431307 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.441748 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.453539 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454028 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-notification-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454057 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-notification-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454073 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="dnsmasq-dns" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454082 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="dnsmasq-dns" Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454097 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="sg-core" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454106 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="sg-core" Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454118 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="proxy-httpd" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454125 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="proxy-httpd" Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454140 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="init" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454149 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="init" Sep 30 14:35:32 crc kubenswrapper[4936]: E0930 14:35:32.454173 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-central-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454183 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" 
containerName="ceilometer-central-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454430 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="proxy-httpd" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454450 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f86ab2e-56dc-4b7a-a3a3-b69b0922932e" containerName="dnsmasq-dns" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454470 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-notification-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454482 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="sg-core" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.454499 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" containerName="ceilometer-central-agent" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.456964 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.462033 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.462425 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.467705 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.485734 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.530683 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-log-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.530903 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-scripts\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531139 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-run-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531273 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-config-data\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531472 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531659 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531687 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.531712 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t92bc\" (UniqueName: \"kubernetes.io/projected/70edb384-47db-473c-95d3-28a20a1857e0-kube-api-access-t92bc\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633499 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-scripts\") pod \"ceilometer-0\" (UID: 
\"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633627 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-run-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633673 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-config-data\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633725 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633781 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633802 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633826 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t92bc\" (UniqueName: \"kubernetes.io/projected/70edb384-47db-473c-95d3-28a20a1857e0-kube-api-access-t92bc\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.633848 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-log-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.634844 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-log-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.635884 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70edb384-47db-473c-95d3-28a20a1857e0-run-httpd\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.639047 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-scripts\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.639135 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-config-data\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.641746 4936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.641993 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.642626 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70edb384-47db-473c-95d3-28a20a1857e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.653189 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t92bc\" (UniqueName: \"kubernetes.io/projected/70edb384-47db-473c-95d3-28a20a1857e0-kube-api-access-t92bc\") pod \"ceilometer-0\" (UID: \"70edb384-47db-473c-95d3-28a20a1857e0\") " pod="openstack/ceilometer-0" Sep 30 14:35:32 crc kubenswrapper[4936]: I0930 14:35:32.786395 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:35:33 crc kubenswrapper[4936]: I0930 14:35:33.337277 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:35:33 crc kubenswrapper[4936]: I0930 14:35:33.821220 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70edb384-47db-473c-95d3-28a20a1857e0","Type":"ContainerStarted","Data":"84d0ea8db732c2226fb53b836e23a7af198d7dab8c8f66310e47fce5fc60dc81"} Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.338237 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ebf958-8faf-43dc-b4f4-7ba0c6832a31" path="/var/lib/kubelet/pods/b3ebf958-8faf-43dc-b4f4-7ba0c6832a31/volumes" Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.826810 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.827172 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.829881 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.870193 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70edb384-47db-473c-95d3-28a20a1857e0","Type":"ContainerStarted","Data":"957e51ba102c15151336f25da0d0a70036c289241a9a0cf261f08614e3576b08"} Sep 30 14:35:34 crc kubenswrapper[4936]: I0930 14:35:34.885867 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:35 crc kubenswrapper[4936]: I0930 14:35:35.885696 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" 
containerName="manila-scheduler" containerID="cri-o://dd4707b63739c1ee8605f50555a7aac8746f8b89ad1eb634d1772040c33ab293" gracePeriod=30 Sep 30 14:35:35 crc kubenswrapper[4936]: I0930 14:35:35.886403 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70edb384-47db-473c-95d3-28a20a1857e0","Type":"ContainerStarted","Data":"41309f2453eff15927bb6c2950b0485f85b7be83dafd10279ad1e39148f5f72e"} Sep 30 14:35:35 crc kubenswrapper[4936]: I0930 14:35:35.886752 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="probe" containerID="cri-o://6565d8446c76e22d8ff127d651af2f997a103c104891862cda2329f3f3672eda" gracePeriod=30 Sep 30 14:35:35 crc kubenswrapper[4936]: I0930 14:35:35.956931 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sqk9d" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" probeResult="failure" output=< Sep 30 14:35:35 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:35:35 crc kubenswrapper[4936]: > Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.780256 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.782761 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.789019 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.895722 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70edb384-47db-473c-95d3-28a20a1857e0","Type":"ContainerStarted","Data":"23c09b28e2702e9d02bf74529c51f6174d71a2b53338e7337cb47fc4b20524bd"} Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.901017 4936 generic.go:334] "Generic (PLEG): container finished" podID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerID="6565d8446c76e22d8ff127d651af2f997a103c104891862cda2329f3f3672eda" exitCode=0 Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.901110 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerDied","Data":"6565d8446c76e22d8ff127d651af2f997a103c104891862cda2329f3f3672eda"} Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.954769 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.955085 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:36 crc kubenswrapper[4936]: I0930 14:35:36.955202 4936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwpm\" (UniqueName: \"kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.042305 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.042408 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.045186 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.057686 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.057787 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwpm\" (UniqueName: \"kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.057889 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.058482 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.058737 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.081681 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxwpm\" (UniqueName: \"kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm\") pod \"redhat-operators-hk7c7\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.105239 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.479873 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.479904 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.487465 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.817685 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:35:37 crc kubenswrapper[4936]: W0930 14:35:37.841637 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8757117d_bfbe_4a6d_86c2_38b828997274.slice/crio-da55c72338635390355a1cdbef598de52100093f4415aebd68fa0d1dc9141eb4 WatchSource:0}: Error finding container da55c72338635390355a1cdbef598de52100093f4415aebd68fa0d1dc9141eb4: Status 404 returned error can't find the container with id da55c72338635390355a1cdbef598de52100093f4415aebd68fa0d1dc9141eb4 Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.918967 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerStarted","Data":"da55c72338635390355a1cdbef598de52100093f4415aebd68fa0d1dc9141eb4"} Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.924714 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"70edb384-47db-473c-95d3-28a20a1857e0","Type":"ContainerStarted","Data":"083698f8fd6bcd5a49a31f679506f4a9620c5d226755b54bf1f694743ddb1ccf"} Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.924849 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:35:37 crc kubenswrapper[4936]: I0930 14:35:37.951193 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.056083179 podStartE2EDuration="5.951167075s" podCreationTimestamp="2025-09-30 14:35:32 +0000 UTC" firstStartedPulling="2025-09-30 14:35:33.342649368 +0000 UTC m=+3383.726651669" lastFinishedPulling="2025-09-30 14:35:37.237733274 +0000 UTC m=+3387.621735565" observedRunningTime="2025-09-30 14:35:37.944749119 +0000 UTC m=+3388.328751440" watchObservedRunningTime="2025-09-30 14:35:37.951167075 +0000 UTC m=+3388.335169376" Sep 30 14:35:38 crc kubenswrapper[4936]: I0930 14:35:38.935382 4936 generic.go:334] "Generic (PLEG): container finished" podID="8757117d-bfbe-4a6d-86c2-38b828997274" containerID="8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f" exitCode=0 Sep 30 14:35:38 crc kubenswrapper[4936]: I0930 14:35:38.935572 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerDied","Data":"8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f"} Sep 30 14:35:39 crc kubenswrapper[4936]: I0930 14:35:39.955261 4936 generic.go:334] "Generic (PLEG): container finished" podID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerID="dd4707b63739c1ee8605f50555a7aac8746f8b89ad1eb634d1772040c33ab293" exitCode=0 Sep 30 14:35:39 crc kubenswrapper[4936]: I0930 14:35:39.955455 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerDied","Data":"dd4707b63739c1ee8605f50555a7aac8746f8b89ad1eb634d1772040c33ab293"} Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.343177 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430425 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle\") pod \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430525 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom\") pod \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430579 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id\") pod \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430686 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts\") pod \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430738 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data\") pod 
\"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.430862 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.431250 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64j2d\" (UniqueName: \"kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d\") pod \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\" (UID: \"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0\") " Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.432086 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.447511 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts" (OuterVolumeSpecName: "scripts") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.472150 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d" (OuterVolumeSpecName: "kube-api-access-64j2d") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "kube-api-access-64j2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.472225 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.542255 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.542297 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.542309 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64j2d\" (UniqueName: \"kubernetes.io/projected/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-kube-api-access-64j2d\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.593665 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.643659 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.660654 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data" (OuterVolumeSpecName: "config-data") pod "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" (UID: "a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.746090 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.968729 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0","Type":"ContainerDied","Data":"fc656b4d7a546b7b95831f52ec8bc67aac22a73d191717e0894d1f6a08c16ee5"} Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.969196 4936 scope.go:117] "RemoveContainer" containerID="6565d8446c76e22d8ff127d651af2f997a103c104891862cda2329f3f3672eda" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.968744 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:40 crc kubenswrapper[4936]: I0930 14:35:40.972587 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerStarted","Data":"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d"} Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.012651 4936 scope.go:117] "RemoveContainer" containerID="dd4707b63739c1ee8605f50555a7aac8746f8b89ad1eb634d1772040c33ab293" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.042628 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.061510 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.076662 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:41 crc kubenswrapper[4936]: E0930 14:35:41.077149 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="probe" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.077174 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="probe" Sep 30 14:35:41 crc kubenswrapper[4936]: E0930 14:35:41.077204 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="manila-scheduler" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.077212 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="manila-scheduler" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.077510 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="probe" Sep 30 
14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.077531 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" containerName="manila-scheduler" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.078665 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.083426 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.091177 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156666 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156771 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fa47185-cf71-4145-b46f-f524902914f3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156788 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-scripts\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156823 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156911 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.156927 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zzw4\" (UniqueName: \"kubernetes.io/projected/8fa47185-cf71-4145-b46f-f524902914f3-kube-api-access-6zzw4\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260625 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fa47185-cf71-4145-b46f-f524902914f3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260676 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-scripts\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260714 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260795 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260812 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zzw4\" (UniqueName: \"kubernetes.io/projected/8fa47185-cf71-4145-b46f-f524902914f3-kube-api-access-6zzw4\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.260866 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.261627 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fa47185-cf71-4145-b46f-f524902914f3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.270320 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: 
\"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.270548 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.287058 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-scripts\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.288928 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa47185-cf71-4145-b46f-f524902914f3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.292913 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zzw4\" (UniqueName: \"kubernetes.io/projected/8fa47185-cf71-4145-b46f-f524902914f3-kube-api-access-6zzw4\") pod \"manila-scheduler-0\" (UID: \"8fa47185-cf71-4145-b46f-f524902914f3\") " pod="openstack/manila-scheduler-0" Sep 30 14:35:41 crc kubenswrapper[4936]: I0930 14:35:41.402481 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 14:35:42 crc kubenswrapper[4936]: I0930 14:35:42.053864 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 14:35:42 crc kubenswrapper[4936]: I0930 14:35:42.382486 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0" path="/var/lib/kubelet/pods/a2d4f49d-8397-40f6-b24e-b5fe86fb0cf0/volumes" Sep 30 14:35:43 crc kubenswrapper[4936]: I0930 14:35:43.008439 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fa47185-cf71-4145-b46f-f524902914f3","Type":"ContainerStarted","Data":"afb4929513f4020bcb8683d07284a848ce81b23a7eb37066d02062e3543a8814"} Sep 30 14:35:43 crc kubenswrapper[4936]: I0930 14:35:43.008912 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fa47185-cf71-4145-b46f-f524902914f3","Type":"ContainerStarted","Data":"ba991b7a666adb79d92d302d84cb46f417e5297dc2a28525259beea212cc7852"} Sep 30 14:35:44 crc kubenswrapper[4936]: I0930 14:35:44.019248 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8fa47185-cf71-4145-b46f-f524902914f3","Type":"ContainerStarted","Data":"e0be012a490c30213b43de44df2f2f1fe2ad1bb050a7007062c772c2a7c226ff"} Sep 30 14:35:44 crc kubenswrapper[4936]: I0930 14:35:44.062153 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.062134784 podStartE2EDuration="3.062134784s" podCreationTimestamp="2025-09-30 14:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:35:44.055898323 +0000 UTC m=+3394.439900634" watchObservedRunningTime="2025-09-30 14:35:44.062134784 +0000 UTC m=+3394.446137085" Sep 30 14:35:44 crc kubenswrapper[4936]: I0930 
14:35:44.196256 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Sep 30 14:35:44 crc kubenswrapper[4936]: I0930 14:35:44.865832 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Sep 30 14:35:44 crc kubenswrapper[4936]: I0930 14:35:44.941410 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:45 crc kubenswrapper[4936]: I0930 14:35:45.029105 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="manila-share" containerID="cri-o://624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f" gracePeriod=30 Sep 30 14:35:45 crc kubenswrapper[4936]: I0930 14:35:45.030119 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="probe" containerID="cri-o://47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065" gracePeriod=30 Sep 30 14:35:45 crc kubenswrapper[4936]: I0930 14:35:45.941374 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sqk9d" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" probeResult="failure" output=< Sep 30 14:35:45 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:35:45 crc kubenswrapper[4936]: > Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.040933 4936 generic.go:334] "Generic (PLEG): container finished" podID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerID="47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065" exitCode=0 Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.040997 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerDied","Data":"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065"} Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.772178 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910111 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910175 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910260 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910292 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910315 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts\") pod 
\"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910368 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910434 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.910476 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvv6\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6\") pod \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\" (UID: \"8bb49504-7154-45ef-bd12-9ce1d3d1cf33\") " Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.911144 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.911450 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.920567 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph" (OuterVolumeSpecName: "ceph") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.920796 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6" (OuterVolumeSpecName: "kube-api-access-fdvv6") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "kube-api-access-fdvv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.921096 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.926501 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts" (OuterVolumeSpecName: "scripts") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:46 crc kubenswrapper[4936]: I0930 14:35:46.998060 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.014887 4936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.015121 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvv6\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-kube-api-access-fdvv6\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.015218 4936 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.015303 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.017400 4936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.017479 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.017560 4936 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-var-lib-manila\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.042464 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.079874 4936 generic.go:334] "Generic (PLEG): container finished" podID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerID="624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f" exitCode=1 Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.079967 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerDied","Data":"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f"} Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.080000 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8bb49504-7154-45ef-bd12-9ce1d3d1cf33","Type":"ContainerDied","Data":"d13a1bb09fcc6452b3b25042f4cbccb0971a4a298444f97a2a1108e6274bc08b"} Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.080043 4936 scope.go:117] "RemoveContainer" containerID="47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.080221 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.095504 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data" (OuterVolumeSpecName: "config-data") pod "8bb49504-7154-45ef-bd12-9ce1d3d1cf33" (UID: "8bb49504-7154-45ef-bd12-9ce1d3d1cf33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.095773 4936 generic.go:334] "Generic (PLEG): container finished" podID="8757117d-bfbe-4a6d-86c2-38b828997274" containerID="66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d" exitCode=0 Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.095812 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerDied","Data":"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d"} Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.122420 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb49504-7154-45ef-bd12-9ce1d3d1cf33-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.127776 4936 scope.go:117] "RemoveContainer" containerID="624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.167177 4936 scope.go:117] "RemoveContainer" containerID="47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065" Sep 30 14:35:47 crc kubenswrapper[4936]: E0930 14:35:47.167731 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065\": container with ID starting with 
47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065 not found: ID does not exist" containerID="47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.167791 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065"} err="failed to get container status \"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065\": rpc error: code = NotFound desc = could not find container \"47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065\": container with ID starting with 47c0bd83e71b7b5f4f9c9be42a2a27012eb9ab1353d613f72be574c22fe63065 not found: ID does not exist" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.167824 4936 scope.go:117] "RemoveContainer" containerID="624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f" Sep 30 14:35:47 crc kubenswrapper[4936]: E0930 14:35:47.168143 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f\": container with ID starting with 624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f not found: ID does not exist" containerID="624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.168166 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f"} err="failed to get container status \"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f\": rpc error: code = NotFound desc = could not find container \"624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f\": container with ID starting with 624a9313dcedb31f0b6334b576690a8d2ab6155090f147808b446dab1378062f not found: ID does not 
exist" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.415019 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.424115 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.446570 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:47 crc kubenswrapper[4936]: E0930 14:35:47.447428 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="manila-share" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.447523 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="manila-share" Sep 30 14:35:47 crc kubenswrapper[4936]: E0930 14:35:47.447627 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="probe" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.447719 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="probe" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.448013 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="probe" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.448103 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" containerName="manila-share" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.451400 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.453810 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.465111 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.481833 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530312 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530457 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530540 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-ceph\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 
14:35:47.530566 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfk6g\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-kube-api-access-gfk6g\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530599 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530629 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530669 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.530725 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-scripts\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632368 4936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632457 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-ceph\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632479 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfk6g\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-kube-api-access-gfk6g\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632497 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632515 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632516 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632546 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632685 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-scripts\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.632839 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.633062 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.637541 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-scripts\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " 
pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.637957 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-ceph\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.638012 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.641140 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.644707 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-config-data\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.655251 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfk6g\" (UniqueName: \"kubernetes.io/projected/a1ecfa95-cc09-43e4-8d90-a65e4f6f74de-kube-api-access-gfk6g\") pod \"manila-share-share1-0\" (UID: \"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de\") " pod="openstack/manila-share-share1-0" Sep 30 14:35:47 crc kubenswrapper[4936]: I0930 14:35:47.781490 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.132019 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerStarted","Data":"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83"} Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.250541 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.250610 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.250680 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.251861 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.251989 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9" gracePeriod=600 Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.326369 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb49504-7154-45ef-bd12-9ce1d3d1cf33" path="/var/lib/kubelet/pods/8bb49504-7154-45ef-bd12-9ce1d3d1cf33/volumes" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.526884 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hk7c7" podStartSLOduration=3.912871843 podStartE2EDuration="12.526861892s" podCreationTimestamp="2025-09-30 14:35:36 +0000 UTC" firstStartedPulling="2025-09-30 14:35:38.937737572 +0000 UTC m=+3389.321739883" lastFinishedPulling="2025-09-30 14:35:47.551727631 +0000 UTC m=+3397.935729932" observedRunningTime="2025-09-30 14:35:48.163761725 +0000 UTC m=+3398.547764056" watchObservedRunningTime="2025-09-30 14:35:48.526861892 +0000 UTC m=+3398.910864193" Sep 30 14:35:48 crc kubenswrapper[4936]: I0930 14:35:48.537158 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 14:35:48 crc kubenswrapper[4936]: W0930 14:35:48.545715 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1ecfa95_cc09_43e4_8d90_a65e4f6f74de.slice/crio-673f4351c748b33bc065066f260adbb59844f36820f3067ee537d03eef6ec365 WatchSource:0}: Error finding container 673f4351c748b33bc065066f260adbb59844f36820f3067ee537d03eef6ec365: Status 404 returned error can't find the container with id 673f4351c748b33bc065066f260adbb59844f36820f3067ee537d03eef6ec365 Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.146661 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de","Type":"ContainerStarted","Data":"0df2497fa47e7dfedca5c567024182e665f52b474ad6f9b35634ef9c3954cf65"} Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.146986 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de","Type":"ContainerStarted","Data":"673f4351c748b33bc065066f260adbb59844f36820f3067ee537d03eef6ec365"} Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.150398 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9" exitCode=0 Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.150431 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9"} Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.150451 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a"} Sep 30 14:35:49 crc kubenswrapper[4936]: I0930 14:35:49.150468 4936 scope.go:117] "RemoveContainer" containerID="5f609aece950b743cdd5cd634e68019f96a8b445a2b22188070893eac16b9b9d" Sep 30 14:35:50 crc kubenswrapper[4936]: I0930 14:35:50.165657 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a1ecfa95-cc09-43e4-8d90-a65e4f6f74de","Type":"ContainerStarted","Data":"7113d6fe86491fa601040f784165e9b5a0a0b80d92d59d0417bec19ee1a2d6ea"} Sep 30 14:35:50 crc kubenswrapper[4936]: I0930 14:35:50.195048 4936 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.195029505 podStartE2EDuration="3.195029505s" podCreationTimestamp="2025-09-30 14:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:35:50.192214647 +0000 UTC m=+3400.576216968" watchObservedRunningTime="2025-09-30 14:35:50.195029505 +0000 UTC m=+3400.579031806" Sep 30 14:35:51 crc kubenswrapper[4936]: I0930 14:35:51.403271 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Sep 30 14:35:55 crc kubenswrapper[4936]: I0930 14:35:55.873622 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sqk9d" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" probeResult="failure" output=< Sep 30 14:35:55 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:35:55 crc kubenswrapper[4936]: > Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.042233 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.042485 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.043483 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7"} pod="openstack/horizon-7c45676df6-k4rk6" containerMessage="Container horizon failed startup probe, will be restarted" Sep 30 
14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.043545 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" containerID="cri-o://0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7" gracePeriod=30 Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.105672 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.105981 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.481913 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.481997 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.482905 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"39d7b37300fc1717d0b4898373a73c768a1fe95a9a5c10542482092767aef12d"} pod="openstack/horizon-7b866fc884-w2td6" containerMessage="Container horizon failed startup probe, will be restarted" Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.482960 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" containerID="cri-o://39d7b37300fc1717d0b4898373a73c768a1fe95a9a5c10542482092767aef12d" 
gracePeriod=30 Sep 30 14:35:57 crc kubenswrapper[4936]: I0930 14:35:57.782767 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Sep 30 14:35:58 crc kubenswrapper[4936]: I0930 14:35:58.157028 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hk7c7" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" probeResult="failure" output=< Sep 30 14:35:58 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:35:58 crc kubenswrapper[4936]: > Sep 30 14:36:02 crc kubenswrapper[4936]: I0930 14:36:02.813858 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 14:36:03 crc kubenswrapper[4936]: I0930 14:36:03.088392 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Sep 30 14:36:04 crc kubenswrapper[4936]: I0930 14:36:04.875956 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:36:04 crc kubenswrapper[4936]: I0930 14:36:04.932559 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:36:05 crc kubenswrapper[4936]: I0930 14:36:05.119058 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.343835 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sqk9d" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" containerID="cri-o://03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0" gracePeriod=2 Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.858296 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.989892 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content\") pod \"5966eab9-ab72-4448-95de-0a079e01fb7f\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.990010 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds88t\" (UniqueName: \"kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t\") pod \"5966eab9-ab72-4448-95de-0a079e01fb7f\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.990182 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities\") pod \"5966eab9-ab72-4448-95de-0a079e01fb7f\" (UID: \"5966eab9-ab72-4448-95de-0a079e01fb7f\") " Sep 30 14:36:06 crc kubenswrapper[4936]: I0930 14:36:06.990937 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities" (OuterVolumeSpecName: "utilities") pod "5966eab9-ab72-4448-95de-0a079e01fb7f" (UID: "5966eab9-ab72-4448-95de-0a079e01fb7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.002218 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t" (OuterVolumeSpecName: "kube-api-access-ds88t") pod "5966eab9-ab72-4448-95de-0a079e01fb7f" (UID: "5966eab9-ab72-4448-95de-0a079e01fb7f"). InnerVolumeSpecName "kube-api-access-ds88t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.057495 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5966eab9-ab72-4448-95de-0a079e01fb7f" (UID: "5966eab9-ab72-4448-95de-0a079e01fb7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.092403 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.092728 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds88t\" (UniqueName: \"kubernetes.io/projected/5966eab9-ab72-4448-95de-0a079e01fb7f-kube-api-access-ds88t\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.092745 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5966eab9-ab72-4448-95de-0a079e01fb7f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.353268 4936 generic.go:334] "Generic (PLEG): container finished" podID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerID="03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0" exitCode=0 Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.353312 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerDied","Data":"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0"} Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.353373 4936 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqk9d" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.353426 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqk9d" event={"ID":"5966eab9-ab72-4448-95de-0a079e01fb7f","Type":"ContainerDied","Data":"c64cdf5788ba022c335e9e86d1b28db3a2939a219354e228690470cbeb6e2d0a"} Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.353447 4936 scope.go:117] "RemoveContainer" containerID="03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.377203 4936 scope.go:117] "RemoveContainer" containerID="e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.393872 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.406448 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sqk9d"] Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.412000 4936 scope.go:117] "RemoveContainer" containerID="41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.460797 4936 scope.go:117] "RemoveContainer" containerID="03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0" Sep 30 14:36:07 crc kubenswrapper[4936]: E0930 14:36:07.462070 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0\": container with ID starting with 03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0 not found: ID does not exist" containerID="03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.462125 
4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0"} err="failed to get container status \"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0\": rpc error: code = NotFound desc = could not find container \"03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0\": container with ID starting with 03b904933841ee44b60c9cd23a350d721487277ff2d1d682cbaeb81d2c697df0 not found: ID does not exist" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.462159 4936 scope.go:117] "RemoveContainer" containerID="e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477" Sep 30 14:36:07 crc kubenswrapper[4936]: E0930 14:36:07.464090 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477\": container with ID starting with e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477 not found: ID does not exist" containerID="e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.464121 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477"} err="failed to get container status \"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477\": rpc error: code = NotFound desc = could not find container \"e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477\": container with ID starting with e41f8c83dbcbca6d1af7f28fdea811d511308e7bb7ec1643c3f29d5809e23477 not found: ID does not exist" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.464140 4936 scope.go:117] "RemoveContainer" containerID="41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a" Sep 30 14:36:07 crc kubenswrapper[4936]: E0930 
14:36:07.464842 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a\": container with ID starting with 41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a not found: ID does not exist" containerID="41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a" Sep 30 14:36:07 crc kubenswrapper[4936]: I0930 14:36:07.464866 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a"} err="failed to get container status \"41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a\": rpc error: code = NotFound desc = could not find container \"41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a\": container with ID starting with 41dca1c7741789651eedc2dcb596d44e86e34bb2b436d62a8da3328fce91332a not found: ID does not exist" Sep 30 14:36:08 crc kubenswrapper[4936]: I0930 14:36:08.158545 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hk7c7" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" probeResult="failure" output=< Sep 30 14:36:08 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:36:08 crc kubenswrapper[4936]: > Sep 30 14:36:08 crc kubenswrapper[4936]: I0930 14:36:08.328845 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" path="/var/lib/kubelet/pods/5966eab9-ab72-4448-95de-0a079e01fb7f/volumes" Sep 30 14:36:09 crc kubenswrapper[4936]: I0930 14:36:09.588656 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Sep 30 14:36:17 crc kubenswrapper[4936]: I0930 14:36:17.156039 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:36:17 crc kubenswrapper[4936]: I0930 14:36:17.204563 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:36:17 crc kubenswrapper[4936]: I0930 14:36:17.394899 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:36:18 crc kubenswrapper[4936]: I0930 14:36:18.461639 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hk7c7" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" containerID="cri-o://bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83" gracePeriod=2 Sep 30 14:36:18 crc kubenswrapper[4936]: I0930 14:36:18.936849 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.033918 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content\") pod \"8757117d-bfbe-4a6d-86c2-38b828997274\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.033963 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities\") pod \"8757117d-bfbe-4a6d-86c2-38b828997274\" (UID: \"8757117d-bfbe-4a6d-86c2-38b828997274\") " Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.034110 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwpm\" (UniqueName: \"kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm\") pod \"8757117d-bfbe-4a6d-86c2-38b828997274\" (UID: 
\"8757117d-bfbe-4a6d-86c2-38b828997274\") " Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.034936 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities" (OuterVolumeSpecName: "utilities") pod "8757117d-bfbe-4a6d-86c2-38b828997274" (UID: "8757117d-bfbe-4a6d-86c2-38b828997274"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.039462 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm" (OuterVolumeSpecName: "kube-api-access-vxwpm") pod "8757117d-bfbe-4a6d-86c2-38b828997274" (UID: "8757117d-bfbe-4a6d-86c2-38b828997274"). InnerVolumeSpecName "kube-api-access-vxwpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.119507 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8757117d-bfbe-4a6d-86c2-38b828997274" (UID: "8757117d-bfbe-4a6d-86c2-38b828997274"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.136865 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.137108 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8757117d-bfbe-4a6d-86c2-38b828997274-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.137151 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxwpm\" (UniqueName: \"kubernetes.io/projected/8757117d-bfbe-4a6d-86c2-38b828997274-kube-api-access-vxwpm\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.472256 4936 generic.go:334] "Generic (PLEG): container finished" podID="8757117d-bfbe-4a6d-86c2-38b828997274" containerID="bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83" exitCode=0 Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.472301 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerDied","Data":"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83"} Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.472326 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hk7c7" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.472363 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hk7c7" event={"ID":"8757117d-bfbe-4a6d-86c2-38b828997274","Type":"ContainerDied","Data":"da55c72338635390355a1cdbef598de52100093f4415aebd68fa0d1dc9141eb4"} Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.472388 4936 scope.go:117] "RemoveContainer" containerID="bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.520406 4936 scope.go:117] "RemoveContainer" containerID="66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.523740 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.534236 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hk7c7"] Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.549536 4936 scope.go:117] "RemoveContainer" containerID="8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.593086 4936 scope.go:117] "RemoveContainer" containerID="bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83" Sep 30 14:36:19 crc kubenswrapper[4936]: E0930 14:36:19.593824 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83\": container with ID starting with bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83 not found: ID does not exist" containerID="bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.593866 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83"} err="failed to get container status \"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83\": rpc error: code = NotFound desc = could not find container \"bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83\": container with ID starting with bc3199f6c381766304b1569906bdbf955d72f52eb3f9529c793ff38244bc9a83 not found: ID does not exist" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.593897 4936 scope.go:117] "RemoveContainer" containerID="66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d" Sep 30 14:36:19 crc kubenswrapper[4936]: E0930 14:36:19.594685 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d\": container with ID starting with 66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d not found: ID does not exist" containerID="66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.594720 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d"} err="failed to get container status \"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d\": rpc error: code = NotFound desc = could not find container \"66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d\": container with ID starting with 66ee0662281fa0a3d03cae0f8e16599ce84f922fb7825cb64dd20649728ce15d not found: ID does not exist" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.594739 4936 scope.go:117] "RemoveContainer" containerID="8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f" Sep 30 14:36:19 crc kubenswrapper[4936]: E0930 
14:36:19.595036 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f\": container with ID starting with 8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f not found: ID does not exist" containerID="8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f" Sep 30 14:36:19 crc kubenswrapper[4936]: I0930 14:36:19.595081 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f"} err="failed to get container status \"8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f\": rpc error: code = NotFound desc = could not find container \"8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f\": container with ID starting with 8806ede1b27274b37e062741a614b64e6e6d647927d41db2b16e990499f51f8f not found: ID does not exist" Sep 30 14:36:20 crc kubenswrapper[4936]: I0930 14:36:20.327486 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" path="/var/lib/kubelet/pods/8757117d-bfbe-4a6d-86c2-38b828997274/volumes" Sep 30 14:36:27 crc kubenswrapper[4936]: I0930 14:36:27.610939 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfa0e282-87b9-4509-ad57-429aa110b324" containerID="0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7" exitCode=137 Sep 30 14:36:27 crc kubenswrapper[4936]: I0930 14:36:27.611489 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerDied","Data":"0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7"} Sep 30 14:36:27 crc kubenswrapper[4936]: I0930 14:36:27.611521 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" 
event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerStarted","Data":"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a"} Sep 30 14:36:27 crc kubenswrapper[4936]: I0930 14:36:27.611537 4936 scope.go:117] "RemoveContainer" containerID="16366588f2d33c86e7f8823a43493c3e6b00d89b4c8628779ccffe5c17a02362" Sep 30 14:36:28 crc kubenswrapper[4936]: I0930 14:36:28.631415 4936 generic.go:334] "Generic (PLEG): container finished" podID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerID="39d7b37300fc1717d0b4898373a73c768a1fe95a9a5c10542482092767aef12d" exitCode=137 Sep 30 14:36:28 crc kubenswrapper[4936]: I0930 14:36:28.631739 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerDied","Data":"39d7b37300fc1717d0b4898373a73c768a1fe95a9a5c10542482092767aef12d"} Sep 30 14:36:28 crc kubenswrapper[4936]: I0930 14:36:28.631811 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b866fc884-w2td6" event={"ID":"1e28ad1d-adf7-4316-9df6-db8a7c1e3933","Type":"ContainerStarted","Data":"89252c4feb80c06cd4627f372dfd4c3355b358a9f3da81f76afe64361b5f4654"} Sep 30 14:36:28 crc kubenswrapper[4936]: I0930 14:36:28.631829 4936 scope.go:117] "RemoveContainer" containerID="4a94f16fe2840424daa44165b9f680e409954108884ef70879ace7ca6c302cf2" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.041818 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.042486 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.042957 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.480635 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.480682 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:36:37 crc kubenswrapper[4936]: I0930 14:36:37.481915 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:36:47 crc kubenswrapper[4936]: I0930 14:36:47.042694 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:36:47 crc kubenswrapper[4936]: I0930 14:36:47.479793 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b866fc884-w2td6" podUID="1e28ad1d-adf7-4316-9df6-db8a7c1e3933" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.247:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.247:8443: connect: connection refused" Sep 30 14:36:59 crc kubenswrapper[4936]: I0930 14:36:59.437225 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:36:59 crc kubenswrapper[4936]: I0930 14:36:59.444851 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.164055 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b866fc884-w2td6" Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.240105 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.240333 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon-log" containerID="cri-o://f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a" gracePeriod=30 Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.240484 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" containerID="cri-o://ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a" gracePeriod=30 Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.255002 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 14:37:01 crc kubenswrapper[4936]: I0930 14:37:01.259309 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 14:37:05 crc kubenswrapper[4936]: I0930 14:37:05.000048 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfa0e282-87b9-4509-ad57-429aa110b324" 
containerID="ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a" exitCode=0 Sep 30 14:37:05 crc kubenswrapper[4936]: I0930 14:37:05.000126 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerDied","Data":"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a"} Sep 30 14:37:05 crc kubenswrapper[4936]: I0930 14:37:05.000696 4936 scope.go:117] "RemoveContainer" containerID="0933aabe8a2f7739dded14f44facc4f5ad453c69e3eff2f9767a972b84c427d7" Sep 30 14:37:07 crc kubenswrapper[4936]: I0930 14:37:07.042405 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:37:17 crc kubenswrapper[4936]: I0930 14:37:17.041945 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:37:27 crc kubenswrapper[4936]: I0930 14:37:27.041683 4936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c45676df6-k4rk6" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.246:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.246:8443: connect: connection refused" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.714098 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834086 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834138 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834206 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834414 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834470 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s6gw\" (UniqueName: \"kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834519 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834549 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs\") pod \"bfa0e282-87b9-4509-ad57-429aa110b324\" (UID: \"bfa0e282-87b9-4509-ad57-429aa110b324\") " Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.834865 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs" (OuterVolumeSpecName: "logs") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.835012 4936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfa0e282-87b9-4509-ad57-429aa110b324-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.850743 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw" (OuterVolumeSpecName: "kube-api-access-4s6gw") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "kube-api-access-4s6gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.860085 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.867803 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.875931 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts" (OuterVolumeSpecName: "scripts") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.882020 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data" (OuterVolumeSpecName: "config-data") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.897914 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "bfa0e282-87b9-4509-ad57-429aa110b324" (UID: "bfa0e282-87b9-4509-ad57-429aa110b324"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936672 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s6gw\" (UniqueName: \"kubernetes.io/projected/bfa0e282-87b9-4509-ad57-429aa110b324-kube-api-access-4s6gw\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936711 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936722 4936 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936732 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936741 4936 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfa0e282-87b9-4509-ad57-429aa110b324-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:31 crc kubenswrapper[4936]: I0930 14:37:31.936749 4936 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/bfa0e282-87b9-4509-ad57-429aa110b324-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.261724 4936 generic.go:334] "Generic (PLEG): container finished" podID="bfa0e282-87b9-4509-ad57-429aa110b324" containerID="f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a" exitCode=137 Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.261767 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerDied","Data":"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a"} Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.262075 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c45676df6-k4rk6" event={"ID":"bfa0e282-87b9-4509-ad57-429aa110b324","Type":"ContainerDied","Data":"1d92d0fbf462a16c2ee0f9999b85efc6fdfc5d0d1a9c47924d759aa4cdc850e5"} Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.262101 4936 scope.go:117] "RemoveContainer" containerID="ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.261814 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c45676df6-k4rk6" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.295273 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.301968 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c45676df6-k4rk6"] Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.362015 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" path="/var/lib/kubelet/pods/bfa0e282-87b9-4509-ad57-429aa110b324/volumes" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.434873 4936 scope.go:117] "RemoveContainer" containerID="f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.457622 4936 scope.go:117] "RemoveContainer" containerID="ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a" Sep 30 14:37:32 crc kubenswrapper[4936]: E0930 14:37:32.458226 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a\": container with ID starting with ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a not found: ID does not exist" containerID="ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.458298 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a"} err="failed to get container status \"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a\": rpc error: code = NotFound desc = could not find container \"ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a\": container with ID starting with 
ffc648d8ae8f71600ad3670cd14cccf3c33a2660d0aa2b30d0c549bb7599e09a not found: ID does not exist" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.458327 4936 scope.go:117] "RemoveContainer" containerID="f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a" Sep 30 14:37:32 crc kubenswrapper[4936]: E0930 14:37:32.458842 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a\": container with ID starting with f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a not found: ID does not exist" containerID="f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a" Sep 30 14:37:32 crc kubenswrapper[4936]: I0930 14:37:32.458885 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a"} err="failed to get container status \"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a\": rpc error: code = NotFound desc = could not find container \"f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a\": container with ID starting with f65a829414e0aff49a640721be56f1daee31231f70add732897b553b8335767a not found: ID does not exist" Sep 30 14:37:48 crc kubenswrapper[4936]: I0930 14:37:48.250709 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:37:48 crc kubenswrapper[4936]: I0930 14:37:48.251189 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.290021 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291125 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="extract-content" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291140 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="extract-content" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291159 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="extract-content" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291165 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="extract-content" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291176 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291182 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291189 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon-log" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291195 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon-log" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291207 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" 
containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291213 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291222 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="extract-utilities" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291228 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="extract-utilities" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291243 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291248 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291256 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291263 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291277 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="extract-utilities" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291282 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="extract-utilities" Sep 30 14:38:11 crc kubenswrapper[4936]: E0930 14:38:11.291295 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 
14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291300 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291478 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8757117d-bfbe-4a6d-86c2-38b828997274" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291487 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291496 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5966eab9-ab72-4448-95de-0a079e01fb7f" containerName="registry-server" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291505 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon-log" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.291525 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.292162 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.295178 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4zkrs" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.295386 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.295961 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.296006 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.308874 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.335258 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.335462 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.335506 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.438282 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.438672 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.438909 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439047 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439284 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439425 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64cj\" (UniqueName: \"kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439542 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439708 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.439819 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.440639 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.440745 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.449010 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542089 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542186 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64cj\" (UniqueName: \"kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542262 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " 
pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542298 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542327 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542395 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.542574 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.543040 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc 
kubenswrapper[4936]: I0930 14:38:11.544810 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.554918 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.555097 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.561093 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64cj\" (UniqueName: \"kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.577555 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " pod="openstack/tempest-tests-tempest" Sep 30 14:38:11 crc kubenswrapper[4936]: I0930 14:38:11.618250 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:38:12 crc kubenswrapper[4936]: I0930 14:38:12.116270 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 14:38:12 crc kubenswrapper[4936]: I0930 14:38:12.717947 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87e335f7-bd98-45d0-a733-b2fc2dd3076e","Type":"ContainerStarted","Data":"ec99baf5358c4ba9be85a2d451817edc3aba473498adcb848bfca6b6df0f0e24"} Sep 30 14:38:18 crc kubenswrapper[4936]: I0930 14:38:18.250630 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:38:18 crc kubenswrapper[4936]: I0930 14:38:18.251384 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:38:48 crc kubenswrapper[4936]: I0930 14:38:48.249889 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:38:48 crc kubenswrapper[4936]: I0930 14:38:48.250487 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Sep 30 14:38:48 crc kubenswrapper[4936]: I0930 14:38:48.250533 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:38:48 crc kubenswrapper[4936]: I0930 14:38:48.251227 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:38:48 crc kubenswrapper[4936]: I0930 14:38:48.251272 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" gracePeriod=600 Sep 30 14:38:49 crc kubenswrapper[4936]: I0930 14:38:49.080821 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" exitCode=0 Sep 30 14:38:49 crc kubenswrapper[4936]: I0930 14:38:49.081131 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a"} Sep 30 14:38:49 crc kubenswrapper[4936]: I0930 14:38:49.081162 4936 scope.go:117] "RemoveContainer" containerID="9a8801a42a802f6d35109a7572a6ec617e571aeaa1ac23c1321a54407ac1c0a9" Sep 30 14:38:50 crc kubenswrapper[4936]: E0930 14:38:50.383945 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:38:50 crc kubenswrapper[4936]: E0930 14:38:50.465134 4936 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Sep 30 14:38:50 crc kubenswrapper[4936]: E0930 14:38:50.471174 4936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,Recu
rsiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v64cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(87e335f7-bd98-45d0-a733-b2fc2dd3076e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:38:50 crc kubenswrapper[4936]: E0930 14:38:50.472383 4936 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" Sep 30 14:38:51 crc kubenswrapper[4936]: I0930 14:38:51.103925 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:38:51 crc kubenswrapper[4936]: E0930 14:38:51.104199 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:38:51 crc kubenswrapper[4936]: E0930 14:38:51.104916 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" Sep 30 14:39:04 crc kubenswrapper[4936]: I0930 14:39:04.315384 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:39:04 crc kubenswrapper[4936]: E0930 14:39:04.316141 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:39:05 crc kubenswrapper[4936]: I0930 14:39:05.321435 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:39:07 crc kubenswrapper[4936]: I0930 14:39:07.279565 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87e335f7-bd98-45d0-a733-b2fc2dd3076e","Type":"ContainerStarted","Data":"b3be5ab12030d972a9d2d3d41911c4778b4f544b9b6f798fe40f380b751113dc"} Sep 30 14:39:07 crc kubenswrapper[4936]: I0930 14:39:07.299071 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.645933236 podStartE2EDuration="57.299056404s" podCreationTimestamp="2025-09-30 14:38:10 +0000 UTC" firstStartedPulling="2025-09-30 14:38:12.12266415 +0000 UTC m=+3542.506666451" lastFinishedPulling="2025-09-30 14:39:05.775787318 +0000 UTC m=+3596.159789619" observedRunningTime="2025-09-30 14:39:07.295307851 +0000 UTC m=+3597.679310152" watchObservedRunningTime="2025-09-30 14:39:07.299056404 +0000 UTC m=+3597.683058705" Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.927738 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.928802 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa0e282-87b9-4509-ad57-429aa110b324" containerName="horizon" Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.930085 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.939854 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.997812 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.997891 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:12 crc kubenswrapper[4936]: I0930 14:39:12.997918 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k72qj\" (UniqueName: \"kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.099562 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.099847 4936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.099944 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k72qj\" (UniqueName: \"kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.100065 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.100205 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.129531 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k72qj\" (UniqueName: \"kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj\") pod \"redhat-marketplace-h5z8f\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.253860 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:13 crc kubenswrapper[4936]: I0930 14:39:13.739361 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:14 crc kubenswrapper[4936]: I0930 14:39:14.343626 4936 generic.go:334] "Generic (PLEG): container finished" podID="58d41775-00b6-4416-bdea-cc975b46d701" containerID="687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61" exitCode=0 Sep 30 14:39:14 crc kubenswrapper[4936]: I0930 14:39:14.343672 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerDied","Data":"687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61"} Sep 30 14:39:14 crc kubenswrapper[4936]: I0930 14:39:14.343701 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerStarted","Data":"9c746b3bce07611b7657d34ebf854c3a827d3f4eff79d5a018c223fbc83d517b"} Sep 30 14:39:15 crc kubenswrapper[4936]: I0930 14:39:15.353947 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerStarted","Data":"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88"} Sep 30 14:39:16 crc kubenswrapper[4936]: I0930 14:39:16.364072 4936 generic.go:334] "Generic (PLEG): container finished" podID="58d41775-00b6-4416-bdea-cc975b46d701" containerID="9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88" exitCode=0 Sep 30 14:39:16 crc kubenswrapper[4936]: I0930 14:39:16.364150 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" 
event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerDied","Data":"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88"} Sep 30 14:39:17 crc kubenswrapper[4936]: I0930 14:39:17.316273 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:39:17 crc kubenswrapper[4936]: E0930 14:39:17.316875 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:39:17 crc kubenswrapper[4936]: I0930 14:39:17.376291 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerStarted","Data":"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570"} Sep 30 14:39:17 crc kubenswrapper[4936]: I0930 14:39:17.401318 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5z8f" podStartSLOduration=2.929489286 podStartE2EDuration="5.401297135s" podCreationTimestamp="2025-09-30 14:39:12 +0000 UTC" firstStartedPulling="2025-09-30 14:39:14.345496821 +0000 UTC m=+3604.729499122" lastFinishedPulling="2025-09-30 14:39:16.81730467 +0000 UTC m=+3607.201306971" observedRunningTime="2025-09-30 14:39:17.393981185 +0000 UTC m=+3607.777983506" watchObservedRunningTime="2025-09-30 14:39:17.401297135 +0000 UTC m=+3607.785299436" Sep 30 14:39:23 crc kubenswrapper[4936]: I0930 14:39:23.254751 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:23 crc 
kubenswrapper[4936]: I0930 14:39:23.256104 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:23 crc kubenswrapper[4936]: I0930 14:39:23.306730 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:23 crc kubenswrapper[4936]: I0930 14:39:23.475591 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:23 crc kubenswrapper[4936]: I0930 14:39:23.542824 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:25 crc kubenswrapper[4936]: I0930 14:39:25.452202 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5z8f" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="registry-server" containerID="cri-o://cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570" gracePeriod=2 Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.020630 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.162584 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k72qj\" (UniqueName: \"kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj\") pod \"58d41775-00b6-4416-bdea-cc975b46d701\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.162973 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities\") pod \"58d41775-00b6-4416-bdea-cc975b46d701\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.163041 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content\") pod \"58d41775-00b6-4416-bdea-cc975b46d701\" (UID: \"58d41775-00b6-4416-bdea-cc975b46d701\") " Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.163743 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities" (OuterVolumeSpecName: "utilities") pod "58d41775-00b6-4416-bdea-cc975b46d701" (UID: "58d41775-00b6-4416-bdea-cc975b46d701"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.173374 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.180695 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58d41775-00b6-4416-bdea-cc975b46d701" (UID: "58d41775-00b6-4416-bdea-cc975b46d701"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.181587 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj" (OuterVolumeSpecName: "kube-api-access-k72qj") pod "58d41775-00b6-4416-bdea-cc975b46d701" (UID: "58d41775-00b6-4416-bdea-cc975b46d701"). InnerVolumeSpecName "kube-api-access-k72qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.275016 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k72qj\" (UniqueName: \"kubernetes.io/projected/58d41775-00b6-4416-bdea-cc975b46d701-kube-api-access-k72qj\") on node \"crc\" DevicePath \"\"" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.275053 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d41775-00b6-4416-bdea-cc975b46d701-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.463410 4936 generic.go:334] "Generic (PLEG): container finished" podID="58d41775-00b6-4416-bdea-cc975b46d701" containerID="cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570" exitCode=0 Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.463450 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerDied","Data":"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570"} Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.463476 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5z8f" event={"ID":"58d41775-00b6-4416-bdea-cc975b46d701","Type":"ContainerDied","Data":"9c746b3bce07611b7657d34ebf854c3a827d3f4eff79d5a018c223fbc83d517b"} Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.463492 4936 scope.go:117] "RemoveContainer" containerID="cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.463537 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5z8f" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.502977 4936 scope.go:117] "RemoveContainer" containerID="9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.507404 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.528600 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5z8f"] Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.568615 4936 scope.go:117] "RemoveContainer" containerID="687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.600432 4936 scope.go:117] "RemoveContainer" containerID="cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570" Sep 30 14:39:26 crc kubenswrapper[4936]: E0930 14:39:26.601022 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570\": container with ID starting with cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570 not found: ID does not exist" containerID="cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.601093 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570"} err="failed to get container status \"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570\": rpc error: code = NotFound desc = could not find container \"cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570\": container with ID starting with cf1a45b7d2bf8a6763cd2bd8afb56801b126f2ef4994d78ea7a33ff107e0e570 not found: 
ID does not exist" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.601124 4936 scope.go:117] "RemoveContainer" containerID="9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88" Sep 30 14:39:26 crc kubenswrapper[4936]: E0930 14:39:26.601451 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88\": container with ID starting with 9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88 not found: ID does not exist" containerID="9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.601555 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88"} err="failed to get container status \"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88\": rpc error: code = NotFound desc = could not find container \"9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88\": container with ID starting with 9e2a4abce893f2f14836771cde96024583610bd59f3a3f1082d7f5161963ec88 not found: ID does not exist" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.601680 4936 scope.go:117] "RemoveContainer" containerID="687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61" Sep 30 14:39:26 crc kubenswrapper[4936]: E0930 14:39:26.602605 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61\": container with ID starting with 687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61 not found: ID does not exist" containerID="687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61" Sep 30 14:39:26 crc kubenswrapper[4936]: I0930 14:39:26.602727 4936 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61"} err="failed to get container status \"687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61\": rpc error: code = NotFound desc = could not find container \"687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61\": container with ID starting with 687e743f38f1cd587adc9e09c2be7cde5ce66752691aad24575148503a716f61 not found: ID does not exist" Sep 30 14:39:28 crc kubenswrapper[4936]: I0930 14:39:28.315031 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:39:28 crc kubenswrapper[4936]: E0930 14:39:28.315880 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:39:28 crc kubenswrapper[4936]: I0930 14:39:28.325669 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d41775-00b6-4416-bdea-cc975b46d701" path="/var/lib/kubelet/pods/58d41775-00b6-4416-bdea-cc975b46d701/volumes" Sep 30 14:39:42 crc kubenswrapper[4936]: I0930 14:39:42.319124 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:39:42 crc kubenswrapper[4936]: E0930 14:39:42.320806 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:39:55 crc kubenswrapper[4936]: I0930 14:39:55.315810 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:39:55 crc kubenswrapper[4936]: E0930 14:39:55.316658 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:40:09 crc kubenswrapper[4936]: I0930 14:40:09.315912 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:40:09 crc kubenswrapper[4936]: E0930 14:40:09.316816 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:40:20 crc kubenswrapper[4936]: I0930 14:40:20.321388 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:40:20 crc kubenswrapper[4936]: E0930 14:40:20.322989 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:40:35 crc kubenswrapper[4936]: I0930 14:40:35.315738 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:40:35 crc kubenswrapper[4936]: E0930 14:40:35.316518 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:40:50 crc kubenswrapper[4936]: I0930 14:40:50.325461 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:40:50 crc kubenswrapper[4936]: E0930 14:40:50.326258 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:41:01 crc kubenswrapper[4936]: I0930 14:41:01.315613 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:41:01 crc kubenswrapper[4936]: E0930 14:41:01.316437 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:41:15 crc kubenswrapper[4936]: I0930 14:41:15.315677 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:41:15 crc kubenswrapper[4936]: E0930 14:41:15.316392 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:41:26 crc kubenswrapper[4936]: I0930 14:41:26.315606 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:41:26 crc kubenswrapper[4936]: E0930 14:41:26.316307 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:41:37 crc kubenswrapper[4936]: I0930 14:41:37.315789 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:41:37 crc kubenswrapper[4936]: E0930 14:41:37.316684 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:41:50 crc kubenswrapper[4936]: I0930 14:41:50.325678 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:41:50 crc kubenswrapper[4936]: E0930 14:41:50.326466 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:42:05 crc kubenswrapper[4936]: I0930 14:42:05.317318 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:42:05 crc kubenswrapper[4936]: E0930 14:42:05.318037 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:42:18 crc kubenswrapper[4936]: I0930 14:42:18.315476 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:42:18 crc kubenswrapper[4936]: E0930 14:42:18.316482 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:42:31 crc kubenswrapper[4936]: I0930 14:42:31.315657 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:42:31 crc kubenswrapper[4936]: E0930 14:42:31.316463 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:42:43 crc kubenswrapper[4936]: I0930 14:42:43.315388 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:42:43 crc kubenswrapper[4936]: E0930 14:42:43.317617 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:42:57 crc kubenswrapper[4936]: I0930 14:42:57.316554 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:42:57 crc kubenswrapper[4936]: E0930 14:42:57.317257 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:43:10 crc kubenswrapper[4936]: I0930 14:43:10.323705 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:43:10 crc kubenswrapper[4936]: E0930 14:43:10.325097 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:43:22 crc kubenswrapper[4936]: I0930 14:43:22.320154 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:43:22 crc kubenswrapper[4936]: E0930 14:43:22.321253 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:43:34 crc kubenswrapper[4936]: I0930 14:43:34.315450 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:43:34 crc kubenswrapper[4936]: E0930 14:43:34.316192 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:43:45 crc kubenswrapper[4936]: I0930 14:43:45.316260 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:43:45 crc kubenswrapper[4936]: E0930 14:43:45.317968 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.315232 4936 scope.go:117] "RemoveContainer" containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.376996 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srgdh"] Sep 30 14:43:57 crc kubenswrapper[4936]: E0930 14:43:57.377711 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="registry-server" Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.377811 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="registry-server" Sep 30 14:43:57 crc kubenswrapper[4936]: E0930 14:43:57.377892 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="extract-utilities" Sep 30 14:43:57 crc 
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.377949 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="extract-utilities"
Sep 30 14:43:57 crc kubenswrapper[4936]: E0930 14:43:57.378006 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="extract-content"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.378061 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="extract-content"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.378315 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d41775-00b6-4416-bdea-cc975b46d701" containerName="registry-server"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.380129 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.433996 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srgdh"]
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.475449 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpkp\" (UniqueName: \"kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.475584 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.475675 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.578221 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.578707 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.578788 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpkp\" (UniqueName: \"kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.578849 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.579103 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.612146 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpkp\" (UniqueName: \"kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp\") pod \"community-operators-srgdh\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") " pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.775295 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:43:57 crc kubenswrapper[4936]: I0930 14:43:57.874817 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9"}
Sep 30 14:43:58 crc kubenswrapper[4936]: I0930 14:43:58.580803 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srgdh"]
Sep 30 14:43:58 crc kubenswrapper[4936]: I0930 14:43:58.884195 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerStarted","Data":"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"}
Sep 30 14:43:58 crc kubenswrapper[4936]: I0930 14:43:58.885080 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerStarted","Data":"40b18b5377dd47550ee9996d62fa64ebd4eb4c3d122aa70e8c2f9e50c942559e"}
Sep 30 14:43:59 crc kubenswrapper[4936]: I0930 14:43:59.894325 4936 generic.go:334] "Generic (PLEG): container finished" podID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerID="6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f" exitCode=0
Sep 30 14:43:59 crc kubenswrapper[4936]: I0930 14:43:59.894477 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerDied","Data":"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"}
Sep 30 14:44:00 crc kubenswrapper[4936]: I0930 14:44:00.904798 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerStarted","Data":"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"}
Sep 30 14:44:02 crc kubenswrapper[4936]: I0930 14:44:02.926965 4936 generic.go:334] "Generic (PLEG): container finished" podID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerID="969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b" exitCode=0
Sep 30 14:44:02 crc kubenswrapper[4936]: I0930 14:44:02.927481 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerDied","Data":"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"}
Sep 30 14:44:03 crc kubenswrapper[4936]: I0930 14:44:03.940486 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerStarted","Data":"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"}
Sep 30 14:44:03 crc kubenswrapper[4936]: I0930 14:44:03.979024 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srgdh" podStartSLOduration=3.400339293 podStartE2EDuration="6.978995724s" podCreationTimestamp="2025-09-30 14:43:57 +0000 UTC" firstStartedPulling="2025-09-30 14:43:59.898522875 +0000 UTC m=+3890.282525176" lastFinishedPulling="2025-09-30 14:44:03.477179306 +0000 UTC m=+3893.861181607" observedRunningTime="2025-09-30 14:44:03.975520929 +0000 UTC m=+3894.359523230" watchObservedRunningTime="2025-09-30 14:44:03.978995724 +0000 UTC m=+3894.362998025"
Sep 30 14:44:07 crc kubenswrapper[4936]: I0930 14:44:07.776352 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:07 crc kubenswrapper[4936]: I0930 14:44:07.777365 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:08 crc kubenswrapper[4936]: I0930 14:44:08.844027 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-srgdh" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="registry-server" probeResult="failure" output=<
Sep 30 14:44:08 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s
Sep 30 14:44:08 crc kubenswrapper[4936]: >
Sep 30 14:44:17 crc kubenswrapper[4936]: I0930 14:44:17.824445 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:17 crc kubenswrapper[4936]: I0930 14:44:17.882089 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:18 crc kubenswrapper[4936]: I0930 14:44:18.068327 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srgdh"]
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.096189 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srgdh" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="registry-server" containerID="cri-o://7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c" gracePeriod=2
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.686060 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.868518 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content\") pod \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") "
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.871755 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkpkp\" (UniqueName: \"kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp\") pod \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") "
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.871904 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities\") pod \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\" (UID: \"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6\") "
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.872548 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities" (OuterVolumeSpecName: "utilities") pod "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" (UID: "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.884027 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp" (OuterVolumeSpecName: "kube-api-access-vkpkp") pod "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" (UID: "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6"). InnerVolumeSpecName "kube-api-access-vkpkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.923864 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" (UID: "41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.974196 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.974232 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:44:19 crc kubenswrapper[4936]: I0930 14:44:19.974247 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkpkp\" (UniqueName: \"kubernetes.io/projected/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6-kube-api-access-vkpkp\") on node \"crc\" DevicePath \"\""
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.107248 4936 generic.go:334] "Generic (PLEG): container finished" podID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerID="7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c" exitCode=0
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.107300 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerDied","Data":"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"}
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.107350 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srgdh"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.107376 4936 scope.go:117] "RemoveContainer" containerID="7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.107391 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srgdh" event={"ID":"41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6","Type":"ContainerDied","Data":"40b18b5377dd47550ee9996d62fa64ebd4eb4c3d122aa70e8c2f9e50c942559e"}
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.132266 4936 scope.go:117] "RemoveContainer" containerID="969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.155048 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srgdh"]
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.166748 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srgdh"]
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.169229 4936 scope.go:117] "RemoveContainer" containerID="6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.212326 4936 scope.go:117] "RemoveContainer" containerID="7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"
Sep 30 14:44:20 crc kubenswrapper[4936]: E0930 14:44:20.212932 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c\": container with ID starting with 7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c not found: ID does not exist" containerID="7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.212982 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c"} err="failed to get container status \"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c\": rpc error: code = NotFound desc = could not find container \"7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c\": container with ID starting with 7bb5cdd95f6ec7584d3fe11603d81801da1bf1b7365aa5b21b23e37788e90d0c not found: ID does not exist"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.213010 4936 scope.go:117] "RemoveContainer" containerID="969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"
Sep 30 14:44:20 crc kubenswrapper[4936]: E0930 14:44:20.213559 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b\": container with ID starting with 969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b not found: ID does not exist" containerID="969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.213584 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b"} err="failed to get container status \"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b\": rpc error: code = NotFound desc = could not find container \"969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b\": container with ID starting with 969f51c5e69854b8061841bf9d6ac570fb5dcfe2555364c80a262d22457a4c9b not found: ID does not exist"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.213599 4936 scope.go:117] "RemoveContainer" containerID="6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"
Sep 30 14:44:20 crc kubenswrapper[4936]: E0930 14:44:20.214030 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f\": container with ID starting with 6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f not found: ID does not exist" containerID="6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.214101 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f"} err="failed to get container status \"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f\": rpc error: code = NotFound desc = could not find container \"6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f\": container with ID starting with 6383a9360a5e003a921a7f12c16df97676ed8d833f6e75ab64699aa3215c0c1f not found: ID does not exist"
Sep 30 14:44:20 crc kubenswrapper[4936]: I0930 14:44:20.326788 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" path="/var/lib/kubelet/pods/41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6/volumes"
Sep 30 14:44:28 crc kubenswrapper[4936]: I0930 14:44:28.076561 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-vnnf4"]
Sep 30 14:44:28 crc kubenswrapper[4936]: I0930 14:44:28.087327 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-vnnf4"]
Sep 30 14:44:28 crc kubenswrapper[4936]: I0930 14:44:28.326470 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9100f851-91ef-4d1d-8346-047e97aef7c2" path="/var/lib/kubelet/pods/9100f851-91ef-4d1d-8346-047e97aef7c2/volumes"
Sep 30 14:44:41 crc kubenswrapper[4936]: I0930 14:44:41.504074 4936 scope.go:117] "RemoveContainer" containerID="e7b4eb175e63344b1d89123bd36c37a92840469768c64cefe4a8974b7ea807de"
Sep 30 14:44:47 crc kubenswrapper[4936]: I0930 14:44:47.034069 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3a0a-account-create-ldfjs"]
Sep 30 14:44:47 crc kubenswrapper[4936]: I0930 14:44:47.046096 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3a0a-account-create-ldfjs"]
Sep 30 14:44:48 crc kubenswrapper[4936]: I0930 14:44:48.332710 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1a270c-bc12-4d84-9482-b51a8db4be0d" path="/var/lib/kubelet/pods/bc1a270c-bc12-4d84-9482-b51a8db4be0d/volumes"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.150050 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"]
Sep 30 14:45:00 crc kubenswrapper[4936]: E0930 14:45:00.151065 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="extract-utilities"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.151082 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="extract-utilities"
Sep 30 14:45:00 crc kubenswrapper[4936]: E0930 14:45:00.151105 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="extract-content"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.151112 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="extract-content"
Sep 30 14:45:00 crc kubenswrapper[4936]: E0930 14:45:00.151129 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.151136 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.151370 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d786a3-ea4a-4d4f-98df-a0cc8a69b3e6" containerName="registry-server"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.152779 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.158679 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.158720 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.173363 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"]
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.281029 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prd9f\" (UniqueName: \"kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.281632 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.281701 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.384824 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prd9f\" (UniqueName: \"kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.385214 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.385299 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.387313 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.392126 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.407432 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prd9f\" (UniqueName: \"kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f\") pod \"collect-profiles-29320725-t8csx\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.478484 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:00 crc kubenswrapper[4936]: I0930 14:45:00.981418 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"]
Sep 30 14:45:01 crc kubenswrapper[4936]: I0930 14:45:01.526587 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx" event={"ID":"b53cee44-57ae-47e3-9ac6-e60cd92e9092","Type":"ContainerStarted","Data":"40a4b291bec7da020d81bbce243b79dd0367649c9c864428679f5eddad0125bb"}
Sep 30 14:45:01 crc kubenswrapper[4936]: I0930 14:45:01.527001 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx" event={"ID":"b53cee44-57ae-47e3-9ac6-e60cd92e9092","Type":"ContainerStarted","Data":"f44b271e7cf8c01c236d7d6cbe744f7a6433fa38e9a81bcbbc76716964c55c10"}
Sep 30 14:45:01 crc kubenswrapper[4936]: I0930 14:45:01.545358 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx" podStartSLOduration=1.545318214 podStartE2EDuration="1.545318214s" podCreationTimestamp="2025-09-30 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:45:01.543213607 +0000 UTC m=+3951.927215918" watchObservedRunningTime="2025-09-30 14:45:01.545318214 +0000 UTC m=+3951.929320515"
Sep 30 14:45:02 crc kubenswrapper[4936]: I0930 14:45:02.543109 4936 generic.go:334] "Generic (PLEG): container finished" podID="b53cee44-57ae-47e3-9ac6-e60cd92e9092" containerID="40a4b291bec7da020d81bbce243b79dd0367649c9c864428679f5eddad0125bb" exitCode=0
Sep 30 14:45:02 crc kubenswrapper[4936]: I0930 14:45:02.543286 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx" event={"ID":"b53cee44-57ae-47e3-9ac6-e60cd92e9092","Type":"ContainerDied","Data":"40a4b291bec7da020d81bbce243b79dd0367649c9c864428679f5eddad0125bb"}
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.019836 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.094313 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume\") pod \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") "
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.094602 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prd9f\" (UniqueName: \"kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f\") pod \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") "
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.095496 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume\") pod \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\" (UID: \"b53cee44-57ae-47e3-9ac6-e60cd92e9092\") "
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.096378 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume" (OuterVolumeSpecName: "config-volume") pod "b53cee44-57ae-47e3-9ac6-e60cd92e9092" (UID: "b53cee44-57ae-47e3-9ac6-e60cd92e9092"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.097217 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b53cee44-57ae-47e3-9ac6-e60cd92e9092-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.101437 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f" (OuterVolumeSpecName: "kube-api-access-prd9f") pod "b53cee44-57ae-47e3-9ac6-e60cd92e9092" (UID: "b53cee44-57ae-47e3-9ac6-e60cd92e9092"). InnerVolumeSpecName "kube-api-access-prd9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.101563 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b53cee44-57ae-47e3-9ac6-e60cd92e9092" (UID: "b53cee44-57ae-47e3-9ac6-e60cd92e9092"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.199076 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b53cee44-57ae-47e3-9ac6-e60cd92e9092-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.199109 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prd9f\" (UniqueName: \"kubernetes.io/projected/b53cee44-57ae-47e3-9ac6-e60cd92e9092-kube-api-access-prd9f\") on node \"crc\" DevicePath \"\""
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.565055 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx" event={"ID":"b53cee44-57ae-47e3-9ac6-e60cd92e9092","Type":"ContainerDied","Data":"f44b271e7cf8c01c236d7d6cbe744f7a6433fa38e9a81bcbbc76716964c55c10"}
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.565391 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44b271e7cf8c01c236d7d6cbe744f7a6433fa38e9a81bcbbc76716964c55c10"
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.565459 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-t8csx"
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.626165 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth"]
Sep 30 14:45:04 crc kubenswrapper[4936]: I0930 14:45:04.634314 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-wcwth"]
Sep 30 14:45:06 crc kubenswrapper[4936]: I0930 14:45:06.331220 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9758fa00-0827-4d33-838d-9b9ab9703548" path="/var/lib/kubelet/pods/9758fa00-0827-4d33-838d-9b9ab9703548/volumes"
Sep 30 14:45:11 crc kubenswrapper[4936]: I0930 14:45:11.042451 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-2cqnc"]
Sep 30 14:45:11 crc kubenswrapper[4936]: I0930 14:45:11.044819 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-2cqnc"]
Sep 30 14:45:12 crc kubenswrapper[4936]: I0930 14:45:12.333126 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc73413-5641-4845-a637-2430806cfa00" path="/var/lib/kubelet/pods/3dc73413-5641-4845-a637-2430806cfa00/volumes"
Sep 30 14:45:41 crc kubenswrapper[4936]: I0930 14:45:41.633266 4936 scope.go:117] "RemoveContainer" containerID="461fa2fc5e8c4739240d09d131792e4d0f74a72a1c6d56eb302702d6d8dd4214"
Sep 30 14:45:41 crc kubenswrapper[4936]: I0930 14:45:41.661792 4936 scope.go:117] "RemoveContainer" containerID="b5823dbb08cfd4ccd812e5f6f8004fa251be6f9b8f0123737355d7f0adc4ec58"
Sep 30 14:45:41 crc kubenswrapper[4936]: I0930 14:45:41.719663 4936 scope.go:117] "RemoveContainer" containerID="7ebc22322b653d1bf6e19c441532fa1d415aac44347dba3603b526279c501a8c"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.466498 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l74zs"]
Sep 30 14:45:46 crc kubenswrapper[4936]: E0930 14:45:46.467495 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53cee44-57ae-47e3-9ac6-e60cd92e9092" containerName="collect-profiles"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.467515 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53cee44-57ae-47e3-9ac6-e60cd92e9092" containerName="collect-profiles"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.467730 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53cee44-57ae-47e3-9ac6-e60cd92e9092" containerName="collect-profiles"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.469186 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.487242 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l74zs"]
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.567030 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wwp\" (UniqueName: \"kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.567133 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.567221 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.669033 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wwp\" (UniqueName: \"kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.669125 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.669185 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.669629 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.669707 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.713528 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wwp\" (UniqueName: \"kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp\") pod \"certified-operators-l74zs\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:46 crc kubenswrapper[4936]: I0930 14:45:46.790575 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l74zs"
Sep 30 14:45:47 crc kubenswrapper[4936]: I0930 14:45:47.351161 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l74zs"]
Sep 30 14:45:48 crc kubenswrapper[4936]: I0930 14:45:48.020812 4936 generic.go:334] "Generic (PLEG): container finished" podID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerID="9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b" exitCode=0
Sep 30 14:45:48 crc kubenswrapper[4936]: I0930 14:45:48.020965 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerDied","Data":"9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b"}
Sep 30 14:45:48 crc kubenswrapper[4936]: I0930 14:45:48.021472 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerStarted","Data":"e6cb56e62041fc5ee68575f962ae7939f7588bf7340513fc0ecd6aad2ef4bb34"}
Sep 30 14:45:48 crc kubenswrapper[4936]: I0930
14:45:48.029786 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:45:50 crc kubenswrapper[4936]: I0930 14:45:50.042213 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerStarted","Data":"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612"} Sep 30 14:45:51 crc kubenswrapper[4936]: I0930 14:45:51.051668 4936 generic.go:334] "Generic (PLEG): container finished" podID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerID="6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612" exitCode=0 Sep 30 14:45:51 crc kubenswrapper[4936]: I0930 14:45:51.051746 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerDied","Data":"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612"} Sep 30 14:45:52 crc kubenswrapper[4936]: I0930 14:45:52.068018 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerStarted","Data":"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6"} Sep 30 14:45:52 crc kubenswrapper[4936]: I0930 14:45:52.091277 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l74zs" podStartSLOduration=2.654117746 podStartE2EDuration="6.091252328s" podCreationTimestamp="2025-09-30 14:45:46 +0000 UTC" firstStartedPulling="2025-09-30 14:45:48.029549852 +0000 UTC m=+3998.413552153" lastFinishedPulling="2025-09-30 14:45:51.466684434 +0000 UTC m=+4001.850686735" observedRunningTime="2025-09-30 14:45:52.085795379 +0000 UTC m=+4002.469797700" watchObservedRunningTime="2025-09-30 14:45:52.091252328 +0000 UTC m=+4002.475254629" Sep 30 14:45:56 crc 
kubenswrapper[4936]: I0930 14:45:56.791684 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:45:56 crc kubenswrapper[4936]: I0930 14:45:56.792323 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:45:56 crc kubenswrapper[4936]: I0930 14:45:56.845012 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:45:57 crc kubenswrapper[4936]: I0930 14:45:57.177955 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:45:57 crc kubenswrapper[4936]: I0930 14:45:57.236730 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l74zs"] Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.130720 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l74zs" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="registry-server" containerID="cri-o://0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6" gracePeriod=2 Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.664040 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.774299 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities\") pod \"8034904a-14d8-4c91-b55d-98b7c824d8de\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.774524 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content\") pod \"8034904a-14d8-4c91-b55d-98b7c824d8de\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.774666 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5wwp\" (UniqueName: \"kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp\") pod \"8034904a-14d8-4c91-b55d-98b7c824d8de\" (UID: \"8034904a-14d8-4c91-b55d-98b7c824d8de\") " Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.776009 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities" (OuterVolumeSpecName: "utilities") pod "8034904a-14d8-4c91-b55d-98b7c824d8de" (UID: "8034904a-14d8-4c91-b55d-98b7c824d8de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.781560 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp" (OuterVolumeSpecName: "kube-api-access-m5wwp") pod "8034904a-14d8-4c91-b55d-98b7c824d8de" (UID: "8034904a-14d8-4c91-b55d-98b7c824d8de"). InnerVolumeSpecName "kube-api-access-m5wwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.819945 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8034904a-14d8-4c91-b55d-98b7c824d8de" (UID: "8034904a-14d8-4c91-b55d-98b7c824d8de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.877833 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.877878 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5wwp\" (UniqueName: \"kubernetes.io/projected/8034904a-14d8-4c91-b55d-98b7c824d8de-kube-api-access-m5wwp\") on node \"crc\" DevicePath \"\"" Sep 30 14:45:59 crc kubenswrapper[4936]: I0930 14:45:59.877892 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8034904a-14d8-4c91-b55d-98b7c824d8de-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.157848 4936 generic.go:334] "Generic (PLEG): container finished" podID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerID="0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6" exitCode=0 Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.157943 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerDied","Data":"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6"} Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.157986 4936 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-l74zs" event={"ID":"8034904a-14d8-4c91-b55d-98b7c824d8de","Type":"ContainerDied","Data":"e6cb56e62041fc5ee68575f962ae7939f7588bf7340513fc0ecd6aad2ef4bb34"} Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.158033 4936 scope.go:117] "RemoveContainer" containerID="0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.158300 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l74zs" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.196489 4936 scope.go:117] "RemoveContainer" containerID="6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.223968 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l74zs"] Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.230615 4936 scope.go:117] "RemoveContainer" containerID="9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.235548 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l74zs"] Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.300995 4936 scope.go:117] "RemoveContainer" containerID="0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6" Sep 30 14:46:00 crc kubenswrapper[4936]: E0930 14:46:00.301546 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6\": container with ID starting with 0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6 not found: ID does not exist" containerID="0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 
14:46:00.301596 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6"} err="failed to get container status \"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6\": rpc error: code = NotFound desc = could not find container \"0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6\": container with ID starting with 0a2fc5afc35482bb1d8df4b73408ab59f2d16946a4b022939392b203f331e5d6 not found: ID does not exist" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.301628 4936 scope.go:117] "RemoveContainer" containerID="6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612" Sep 30 14:46:00 crc kubenswrapper[4936]: E0930 14:46:00.301937 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612\": container with ID starting with 6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612 not found: ID does not exist" containerID="6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.301977 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612"} err="failed to get container status \"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612\": rpc error: code = NotFound desc = could not find container \"6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612\": container with ID starting with 6333105bdd5c3ddec1a44e7d2fc6af3676ba308a50808fb0f611885222b9b612 not found: ID does not exist" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.301992 4936 scope.go:117] "RemoveContainer" containerID="9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b" Sep 30 14:46:00 crc 
kubenswrapper[4936]: E0930 14:46:00.302319 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b\": container with ID starting with 9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b not found: ID does not exist" containerID="9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.302371 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b"} err="failed to get container status \"9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b\": rpc error: code = NotFound desc = could not find container \"9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b\": container with ID starting with 9c93dd44f8323d5df531ff3c550fe12b4166bc0522df7ee046f2476afefac26b not found: ID does not exist" Sep 30 14:46:00 crc kubenswrapper[4936]: I0930 14:46:00.331116 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" path="/var/lib/kubelet/pods/8034904a-14d8-4c91-b55d-98b7c824d8de/volumes" Sep 30 14:46:18 crc kubenswrapper[4936]: I0930 14:46:18.250698 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:46:18 crc kubenswrapper[4936]: I0930 14:46:18.251490 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.490320 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:46:43 crc kubenswrapper[4936]: E0930 14:46:43.492724 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="extract-utilities" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.492833 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="extract-utilities" Sep 30 14:46:43 crc kubenswrapper[4936]: E0930 14:46:43.492924 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="registry-server" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.492980 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="registry-server" Sep 30 14:46:43 crc kubenswrapper[4936]: E0930 14:46:43.493047 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="extract-content" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.493101 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="extract-content" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.493428 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8034904a-14d8-4c91-b55d-98b7c824d8de" containerName="registry-server" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.495324 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.509887 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.509969 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.510062 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pg8x\" (UniqueName: \"kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.519601 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.611874 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pg8x\" (UniqueName: \"kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.612008 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.612050 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.612712 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.612727 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.631766 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pg8x\" (UniqueName: \"kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x\") pod \"redhat-operators-v8r9k\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:43 crc kubenswrapper[4936]: I0930 14:46:43.837746 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:44 crc kubenswrapper[4936]: I0930 14:46:44.352506 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:46:44 crc kubenswrapper[4936]: W0930 14:46:44.365451 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6bafd3_2894_4c6c_a58b_813170da525e.slice/crio-d7b81189cbd74275b3ce63a5d4aee06d62842ac7922aaca32a4d0c1c30a102e1 WatchSource:0}: Error finding container d7b81189cbd74275b3ce63a5d4aee06d62842ac7922aaca32a4d0c1c30a102e1: Status 404 returned error can't find the container with id d7b81189cbd74275b3ce63a5d4aee06d62842ac7922aaca32a4d0c1c30a102e1 Sep 30 14:46:44 crc kubenswrapper[4936]: I0930 14:46:44.658868 4936 generic.go:334] "Generic (PLEG): container finished" podID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerID="72b7898e74ca60c4bf6b3f5583f1344bbbabafd73fe9fb7f6ef4d89bb4abd2ae" exitCode=0 Sep 30 14:46:44 crc kubenswrapper[4936]: I0930 14:46:44.659066 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerDied","Data":"72b7898e74ca60c4bf6b3f5583f1344bbbabafd73fe9fb7f6ef4d89bb4abd2ae"} Sep 30 14:46:44 crc kubenswrapper[4936]: I0930 14:46:44.659295 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerStarted","Data":"d7b81189cbd74275b3ce63a5d4aee06d62842ac7922aaca32a4d0c1c30a102e1"} Sep 30 14:46:46 crc kubenswrapper[4936]: I0930 14:46:46.680908 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" 
event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerStarted","Data":"a9c067ac64291aa5eb4dff3c9392dd331d3dcd6590ecaf61ba4b6f5073a167e9"} Sep 30 14:46:48 crc kubenswrapper[4936]: I0930 14:46:48.249793 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:46:48 crc kubenswrapper[4936]: I0930 14:46:48.251623 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:46:50 crc kubenswrapper[4936]: I0930 14:46:50.716941 4936 generic.go:334] "Generic (PLEG): container finished" podID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerID="a9c067ac64291aa5eb4dff3c9392dd331d3dcd6590ecaf61ba4b6f5073a167e9" exitCode=0 Sep 30 14:46:50 crc kubenswrapper[4936]: I0930 14:46:50.717024 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerDied","Data":"a9c067ac64291aa5eb4dff3c9392dd331d3dcd6590ecaf61ba4b6f5073a167e9"} Sep 30 14:46:51 crc kubenswrapper[4936]: I0930 14:46:51.729613 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerStarted","Data":"f1c97510a71d56df89f934920fa613605d25290699323b78f206a6163162869e"} Sep 30 14:46:51 crc kubenswrapper[4936]: I0930 14:46:51.757232 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8r9k" 
podStartSLOduration=2.302266595 podStartE2EDuration="8.757203628s" podCreationTimestamp="2025-09-30 14:46:43 +0000 UTC" firstStartedPulling="2025-09-30 14:46:44.661506498 +0000 UTC m=+4055.045508789" lastFinishedPulling="2025-09-30 14:46:51.116443521 +0000 UTC m=+4061.500445822" observedRunningTime="2025-09-30 14:46:51.748259074 +0000 UTC m=+4062.132261375" watchObservedRunningTime="2025-09-30 14:46:51.757203628 +0000 UTC m=+4062.141205929" Sep 30 14:46:53 crc kubenswrapper[4936]: I0930 14:46:53.838134 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:53 crc kubenswrapper[4936]: I0930 14:46:53.838788 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:46:54 crc kubenswrapper[4936]: I0930 14:46:54.885361 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8r9k" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" probeResult="failure" output=< Sep 30 14:46:54 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:46:54 crc kubenswrapper[4936]: > Sep 30 14:47:04 crc kubenswrapper[4936]: I0930 14:47:04.883878 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8r9k" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" probeResult="failure" output=< Sep 30 14:47:04 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:47:04 crc kubenswrapper[4936]: > Sep 30 14:47:13 crc kubenswrapper[4936]: I0930 14:47:13.887553 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:47:13 crc kubenswrapper[4936]: I0930 14:47:13.944372 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:47:16 crc kubenswrapper[4936]: I0930 14:47:16.696988 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:47:16 crc kubenswrapper[4936]: I0930 14:47:16.697824 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8r9k" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" containerID="cri-o://f1c97510a71d56df89f934920fa613605d25290699323b78f206a6163162869e" gracePeriod=2 Sep 30 14:47:16 crc kubenswrapper[4936]: I0930 14:47:16.961090 4936 generic.go:334] "Generic (PLEG): container finished" podID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerID="f1c97510a71d56df89f934920fa613605d25290699323b78f206a6163162869e" exitCode=0 Sep 30 14:47:16 crc kubenswrapper[4936]: I0930 14:47:16.961170 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerDied","Data":"f1c97510a71d56df89f934920fa613605d25290699323b78f206a6163162869e"} Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.271774 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.368292 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content\") pod \"2d6bafd3-2894-4c6c-a58b-813170da525e\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.368373 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pg8x\" (UniqueName: \"kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x\") pod \"2d6bafd3-2894-4c6c-a58b-813170da525e\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.368406 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities\") pod \"2d6bafd3-2894-4c6c-a58b-813170da525e\" (UID: \"2d6bafd3-2894-4c6c-a58b-813170da525e\") " Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.369631 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities" (OuterVolumeSpecName: "utilities") pod "2d6bafd3-2894-4c6c-a58b-813170da525e" (UID: "2d6bafd3-2894-4c6c-a58b-813170da525e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.379205 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x" (OuterVolumeSpecName: "kube-api-access-4pg8x") pod "2d6bafd3-2894-4c6c-a58b-813170da525e" (UID: "2d6bafd3-2894-4c6c-a58b-813170da525e"). InnerVolumeSpecName "kube-api-access-4pg8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.457534 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d6bafd3-2894-4c6c-a58b-813170da525e" (UID: "2d6bafd3-2894-4c6c-a58b-813170da525e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.471405 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.471440 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pg8x\" (UniqueName: \"kubernetes.io/projected/2d6bafd3-2894-4c6c-a58b-813170da525e-kube-api-access-4pg8x\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:17 crc kubenswrapper[4936]: I0930 14:47:17.471452 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d6bafd3-2894-4c6c-a58b-813170da525e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:17.972153 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8r9k" event={"ID":"2d6bafd3-2894-4c6c-a58b-813170da525e","Type":"ContainerDied","Data":"d7b81189cbd74275b3ce63a5d4aee06d62842ac7922aaca32a4d0c1c30a102e1"} Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:17.972210 4936 scope.go:117] "RemoveContainer" containerID="f1c97510a71d56df89f934920fa613605d25290699323b78f206a6163162869e" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:17.972399 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8r9k" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.007309 4936 scope.go:117] "RemoveContainer" containerID="a9c067ac64291aa5eb4dff3c9392dd331d3dcd6590ecaf61ba4b6f5073a167e9" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.056854 4936 scope.go:117] "RemoveContainer" containerID="72b7898e74ca60c4bf6b3f5583f1344bbbabafd73fe9fb7f6ef4d89bb4abd2ae" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.056998 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.067325 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8r9k"] Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.250570 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.250636 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.250696 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.251746 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.251811 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9" gracePeriod=600 Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.327020 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" path="/var/lib/kubelet/pods/2d6bafd3-2894-4c6c-a58b-813170da525e/volumes" Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.984453 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9" exitCode=0 Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.984536 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9"} Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.984861 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72"} Sep 30 14:47:18 crc kubenswrapper[4936]: I0930 14:47:18.984886 4936 scope.go:117] "RemoveContainer" 
containerID="32cd80a2a8c86616d67bbc7857a309d38bae677dceab9f97b4f62a9ecb80707a" Sep 30 14:49:18 crc kubenswrapper[4936]: I0930 14:49:18.250628 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:49:18 crc kubenswrapper[4936]: I0930 14:49:18.251748 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:49:48 crc kubenswrapper[4936]: I0930 14:49:48.250450 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:49:48 crc kubenswrapper[4936]: I0930 14:49:48.251016 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.287464 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.289219 4936 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.289393 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.290559 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.290699 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" gracePeriod=600 Sep 30 14:50:18 crc kubenswrapper[4936]: E0930 14:50:18.420910 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.550323 4936 generic.go:334] "Generic (PLEG): container finished" 
podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" exitCode=0 Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.550390 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72"} Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.550806 4936 scope.go:117] "RemoveContainer" containerID="60d0f6199ae64fedc8219dfc1bb7d66559a4c14d7fda332b82699bce909148d9" Sep 30 14:50:18 crc kubenswrapper[4936]: I0930 14:50:18.552534 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:50:18 crc kubenswrapper[4936]: E0930 14:50:18.552917 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:50:33 crc kubenswrapper[4936]: I0930 14:50:33.316494 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:50:33 crc kubenswrapper[4936]: E0930 14:50:33.317295 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 
14:50:48 crc kubenswrapper[4936]: I0930 14:50:48.319293 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:50:48 crc kubenswrapper[4936]: E0930 14:50:48.320017 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:00 crc kubenswrapper[4936]: I0930 14:51:00.323002 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:00 crc kubenswrapper[4936]: E0930 14:51:00.324035 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:12 crc kubenswrapper[4936]: I0930 14:51:12.316590 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:12 crc kubenswrapper[4936]: E0930 14:51:12.317515 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:24 crc kubenswrapper[4936]: I0930 14:51:24.316611 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:24 crc kubenswrapper[4936]: E0930 14:51:24.317393 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:34 crc kubenswrapper[4936]: I0930 14:51:34.165213 4936 generic.go:334] "Generic (PLEG): container finished" podID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" containerID="b3be5ab12030d972a9d2d3d41911c4778b4f544b9b6f798fe40f380b751113dc" exitCode=0 Sep 30 14:51:34 crc kubenswrapper[4936]: I0930 14:51:34.165313 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87e335f7-bd98-45d0-a733-b2fc2dd3076e","Type":"ContainerDied","Data":"b3be5ab12030d972a9d2d3d41911c4778b4f544b9b6f798fe40f380b751113dc"} Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.316958 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:35 crc kubenswrapper[4936]: E0930 14:51:35.319631 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 
14:51:35.648398 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728595 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728685 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728762 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728788 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728819 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728840 4936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728906 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.728945 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.729020 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v64cj\" (UniqueName: \"kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj\") pod \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\" (UID: \"87e335f7-bd98-45d0-a733-b2fc2dd3076e\") " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.729928 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.733569 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.734201 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data" (OuterVolumeSpecName: "config-data") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.748704 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj" (OuterVolumeSpecName: "kube-api-access-v64cj") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "kube-api-access-v64cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.753711 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.780272 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.794459 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.799083 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.803647 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "87e335f7-bd98-45d0-a733-b2fc2dd3076e" (UID: "87e335f7-bd98-45d0-a733-b2fc2dd3076e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.831113 4936 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.831156 4936 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.831170 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v64cj\" (UniqueName: \"kubernetes.io/projected/87e335f7-bd98-45d0-a733-b2fc2dd3076e-kube-api-access-v64cj\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832206 4936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832246 4936 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832262 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832277 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87e335f7-bd98-45d0-a733-b2fc2dd3076e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832300 
4936 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87e335f7-bd98-45d0-a733-b2fc2dd3076e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.832314 4936 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87e335f7-bd98-45d0-a733-b2fc2dd3076e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.856828 4936 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Sep 30 14:51:35 crc kubenswrapper[4936]: I0930 14:51:35.934372 4936 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:51:36 crc kubenswrapper[4936]: I0930 14:51:36.187145 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87e335f7-bd98-45d0-a733-b2fc2dd3076e","Type":"ContainerDied","Data":"ec99baf5358c4ba9be85a2d451817edc3aba473498adcb848bfca6b6df0f0e24"} Sep 30 14:51:36 crc kubenswrapper[4936]: I0930 14:51:36.187418 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec99baf5358c4ba9be85a2d451817edc3aba473498adcb848bfca6b6df0f0e24" Sep 30 14:51:36 crc kubenswrapper[4936]: I0930 14:51:36.187208 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.753567 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:51:38 crc kubenswrapper[4936]: E0930 14:51:38.754201 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754213 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" Sep 30 14:51:38 crc kubenswrapper[4936]: E0930 14:51:38.754230 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="extract-utilities" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754237 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="extract-utilities" Sep 30 14:51:38 crc kubenswrapper[4936]: E0930 14:51:38.754248 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="extract-content" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754254 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="extract-content" Sep 30 14:51:38 crc kubenswrapper[4936]: E0930 14:51:38.754275 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754281 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754454 4936 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2d6bafd3-2894-4c6c-a58b-813170da525e" containerName="registry-server" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.754467 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e335f7-bd98-45d0-a733-b2fc2dd3076e" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.755045 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.757246 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4zkrs" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.783156 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.905265 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:38 crc kubenswrapper[4936]: I0930 14:51:38.906105 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhrrp\" (UniqueName: \"kubernetes.io/projected/7f8c038d-9623-40b0-b7d1-5a0f66caf6bd-kube-api-access-hhrrp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.009155 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.009267 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhrrp\" (UniqueName: \"kubernetes.io/projected/7f8c038d-9623-40b0-b7d1-5a0f66caf6bd-kube-api-access-hhrrp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.009947 4936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.049478 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhrrp\" (UniqueName: \"kubernetes.io/projected/7f8c038d-9623-40b0-b7d1-5a0f66caf6bd-kube-api-access-hhrrp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.076265 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 
crc kubenswrapper[4936]: I0930 14:51:39.104554 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.625313 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:51:39 crc kubenswrapper[4936]: I0930 14:51:39.634036 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:51:40 crc kubenswrapper[4936]: I0930 14:51:40.225421 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd","Type":"ContainerStarted","Data":"b50934736ab85d1d55b6b4e6f991cb58c9e8bac0577b1cc9dcb70b4f6bde3c92"} Sep 30 14:51:42 crc kubenswrapper[4936]: I0930 14:51:42.249431 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7f8c038d-9623-40b0-b7d1-5a0f66caf6bd","Type":"ContainerStarted","Data":"d7eab72de4082ce32f155baca844dfd26ec7a0939b0c88b2fec482a22f892c15"} Sep 30 14:51:42 crc kubenswrapper[4936]: I0930 14:51:42.269466 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.552961691 podStartE2EDuration="4.26944614s" podCreationTimestamp="2025-09-30 14:51:38 +0000 UTC" firstStartedPulling="2025-09-30 14:51:39.633651235 +0000 UTC m=+4350.017653556" lastFinishedPulling="2025-09-30 14:51:41.350135704 +0000 UTC m=+4351.734138005" observedRunningTime="2025-09-30 14:51:42.261827618 +0000 UTC m=+4352.645829919" watchObservedRunningTime="2025-09-30 14:51:42.26944614 +0000 UTC m=+4352.653448441" Sep 30 14:51:48 crc kubenswrapper[4936]: I0930 14:51:48.316520 4936 scope.go:117] "RemoveContainer" 
containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:48 crc kubenswrapper[4936]: E0930 14:51:48.317310 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.239649 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmvv9/must-gather-wf5rj"] Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.241796 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.258205 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wmvv9"/"openshift-service-ca.crt" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.258209 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wmvv9"/"default-dockercfg-846h5" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.260110 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wmvv9"/"kube-root-ca.crt" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.279525 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmvv9/must-gather-wf5rj"] Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.316516 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:51:59 crc kubenswrapper[4936]: E0930 14:51:59.316834 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.373623 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.373708 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kp7m\" (UniqueName: \"kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.476519 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.476608 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kp7m\" (UniqueName: \"kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: 
I0930 14:51:59.476934 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.496729 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kp7m\" (UniqueName: \"kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m\") pod \"must-gather-wf5rj\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:51:59 crc kubenswrapper[4936]: I0930 14:51:59.565371 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:52:00 crc kubenswrapper[4936]: I0930 14:52:00.066796 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmvv9/must-gather-wf5rj"] Sep 30 14:52:00 crc kubenswrapper[4936]: I0930 14:52:00.408973 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" event={"ID":"bb09b42b-b116-4eaf-91f9-acbb3f472edc","Type":"ContainerStarted","Data":"384dc58aa8c890e0fb156dc7cecf69df23a08d4afbe16f229bbc74025673ce62"} Sep 30 14:52:05 crc kubenswrapper[4936]: I0930 14:52:05.501528 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" event={"ID":"bb09b42b-b116-4eaf-91f9-acbb3f472edc","Type":"ContainerStarted","Data":"669c3bc5bd85483968723725e7849b76776d3f63a19c482e2e67e5ae15fa28d5"} Sep 30 14:52:06 crc kubenswrapper[4936]: I0930 14:52:06.511240 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" 
event={"ID":"bb09b42b-b116-4eaf-91f9-acbb3f472edc","Type":"ContainerStarted","Data":"cfd86fa9fdeeb2199192c4d84171105091af0558492a75c8f6ade580bf7bdccf"} Sep 30 14:52:06 crc kubenswrapper[4936]: I0930 14:52:06.531768 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" podStartSLOduration=2.532578028 podStartE2EDuration="7.531750277s" podCreationTimestamp="2025-09-30 14:51:59 +0000 UTC" firstStartedPulling="2025-09-30 14:52:00.080521697 +0000 UTC m=+4370.464523998" lastFinishedPulling="2025-09-30 14:52:05.079693946 +0000 UTC m=+4375.463696247" observedRunningTime="2025-09-30 14:52:06.53149405 +0000 UTC m=+4376.915496361" watchObservedRunningTime="2025-09-30 14:52:06.531750277 +0000 UTC m=+4376.915752578" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.210397 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-27prt"] Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.212102 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.337005 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlt6\" (UniqueName: \"kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.337470 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.439365 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlt6\" (UniqueName: \"kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.439516 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.439591 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc 
kubenswrapper[4936]: I0930 14:52:11.457458 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlt6\" (UniqueName: \"kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6\") pod \"crc-debug-27prt\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: I0930 14:52:11.529151 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:52:11 crc kubenswrapper[4936]: W0930 14:52:11.569134 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode46151aa_c7f1_4555_a0ed_abb6e7f54d48.slice/crio-43c2dd0e9aeed43c3fe27e4ed4ead3ceaa6d9a4cf673a18044bfe7610558d084 WatchSource:0}: Error finding container 43c2dd0e9aeed43c3fe27e4ed4ead3ceaa6d9a4cf673a18044bfe7610558d084: Status 404 returned error can't find the container with id 43c2dd0e9aeed43c3fe27e4ed4ead3ceaa6d9a4cf673a18044bfe7610558d084 Sep 30 14:52:12 crc kubenswrapper[4936]: I0930 14:52:12.315760 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:52:12 crc kubenswrapper[4936]: E0930 14:52:12.316105 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:52:12 crc kubenswrapper[4936]: I0930 14:52:12.560053 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-27prt" 
event={"ID":"e46151aa-c7f1-4555-a0ed-abb6e7f54d48","Type":"ContainerStarted","Data":"43c2dd0e9aeed43c3fe27e4ed4ead3ceaa6d9a4cf673a18044bfe7610558d084"} Sep 30 14:52:24 crc kubenswrapper[4936]: I0930 14:52:24.706763 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-27prt" event={"ID":"e46151aa-c7f1-4555-a0ed-abb6e7f54d48","Type":"ContainerStarted","Data":"4ca6363ccf92f24196bfdf6d22a02543bd59982d239ec49ae1b8cd32d4ee8855"} Sep 30 14:52:25 crc kubenswrapper[4936]: I0930 14:52:25.315811 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:52:25 crc kubenswrapper[4936]: E0930 14:52:25.316374 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:52:40 crc kubenswrapper[4936]: I0930 14:52:40.325009 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:52:40 crc kubenswrapper[4936]: E0930 14:52:40.325891 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:52:55 crc kubenswrapper[4936]: I0930 14:52:55.315401 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 
30 14:52:55 crc kubenswrapper[4936]: E0930 14:52:55.316119 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:53:07 crc kubenswrapper[4936]: I0930 14:53:07.315708 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:53:07 crc kubenswrapper[4936]: E0930 14:53:07.316285 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:53:20 crc kubenswrapper[4936]: I0930 14:53:20.322301 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:53:20 crc kubenswrapper[4936]: E0930 14:53:20.323040 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.817489 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-wmvv9/crc-debug-27prt" podStartSLOduration=61.700839798 podStartE2EDuration="1m13.817469693s" podCreationTimestamp="2025-09-30 14:52:11 +0000 UTC" firstStartedPulling="2025-09-30 14:52:11.57165765 +0000 UTC m=+4381.955659951" lastFinishedPulling="2025-09-30 14:52:23.688287545 +0000 UTC m=+4394.072289846" observedRunningTime="2025-09-30 14:52:24.730872137 +0000 UTC m=+4395.114874438" watchObservedRunningTime="2025-09-30 14:53:24.817469693 +0000 UTC m=+4455.201471994" Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.826437 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.829054 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.899512 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.978538 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n44\" (UniqueName: \"kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.978706 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:24 crc kubenswrapper[4936]: I0930 14:53:24.978748 4936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.080446 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8n44\" (UniqueName: \"kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.080906 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.080950 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.081318 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.081383 4936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.107217 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8n44\" (UniqueName: \"kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44\") pod \"redhat-marketplace-p7nhj\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.153873 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:25 crc kubenswrapper[4936]: I0930 14:53:25.932952 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:26 crc kubenswrapper[4936]: I0930 14:53:26.266297 4936 generic.go:334] "Generic (PLEG): container finished" podID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerID="36177f893736dac29c8c3bbe7766f535fed00c75e7f00f7d4cc044ca154acdea" exitCode=0 Sep 30 14:53:26 crc kubenswrapper[4936]: I0930 14:53:26.266407 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerDied","Data":"36177f893736dac29c8c3bbe7766f535fed00c75e7f00f7d4cc044ca154acdea"} Sep 30 14:53:26 crc kubenswrapper[4936]: I0930 14:53:26.266641 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerStarted","Data":"62f4897bd72f5b129e2f8f3653aab615b440a26269544eab8fe611ec7c6a281b"} Sep 30 14:53:28 crc kubenswrapper[4936]: I0930 14:53:28.296183 
4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerStarted","Data":"bec5befcf72266937665f8207d96646d247e1a4d8236063c597ef187ad4523bc"} Sep 30 14:53:29 crc kubenswrapper[4936]: I0930 14:53:29.310351 4936 generic.go:334] "Generic (PLEG): container finished" podID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerID="bec5befcf72266937665f8207d96646d247e1a4d8236063c597ef187ad4523bc" exitCode=0 Sep 30 14:53:29 crc kubenswrapper[4936]: I0930 14:53:29.310760 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerDied","Data":"bec5befcf72266937665f8207d96646d247e1a4d8236063c597ef187ad4523bc"} Sep 30 14:53:31 crc kubenswrapper[4936]: I0930 14:53:31.315240 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:53:31 crc kubenswrapper[4936]: E0930 14:53:31.316301 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:53:31 crc kubenswrapper[4936]: I0930 14:53:31.341681 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerStarted","Data":"60b772266b390a227840fe27cd9882e4014eb2034c28d5c50e0a69813fc1a68e"} Sep 30 14:53:31 crc kubenswrapper[4936]: I0930 14:53:31.367074 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-p7nhj" podStartSLOduration=3.777480628 podStartE2EDuration="7.367038996s" podCreationTimestamp="2025-09-30 14:53:24 +0000 UTC" firstStartedPulling="2025-09-30 14:53:26.268229288 +0000 UTC m=+4456.652231589" lastFinishedPulling="2025-09-30 14:53:29.857787656 +0000 UTC m=+4460.241789957" observedRunningTime="2025-09-30 14:53:31.360306389 +0000 UTC m=+4461.744308690" watchObservedRunningTime="2025-09-30 14:53:31.367038996 +0000 UTC m=+4461.751041297" Sep 30 14:53:35 crc kubenswrapper[4936]: I0930 14:53:35.155115 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:35 crc kubenswrapper[4936]: I0930 14:53:35.155758 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:35 crc kubenswrapper[4936]: I0930 14:53:35.205811 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:35 crc kubenswrapper[4936]: I0930 14:53:35.429606 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:35 crc kubenswrapper[4936]: I0930 14:53:35.814897 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:37 crc kubenswrapper[4936]: I0930 14:53:37.397789 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p7nhj" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="registry-server" containerID="cri-o://60b772266b390a227840fe27cd9882e4014eb2034c28d5c50e0a69813fc1a68e" gracePeriod=2 Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.421409 4936 generic.go:334] "Generic (PLEG): container finished" podID="891c5c09-35e3-4a25-bab8-9afd39f549b4" 
containerID="60b772266b390a227840fe27cd9882e4014eb2034c28d5c50e0a69813fc1a68e" exitCode=0 Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.421721 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerDied","Data":"60b772266b390a227840fe27cd9882e4014eb2034c28d5c50e0a69813fc1a68e"} Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.421748 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7nhj" event={"ID":"891c5c09-35e3-4a25-bab8-9afd39f549b4","Type":"ContainerDied","Data":"62f4897bd72f5b129e2f8f3653aab615b440a26269544eab8fe611ec7c6a281b"} Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.421762 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f4897bd72f5b129e2f8f3653aab615b440a26269544eab8fe611ec7c6a281b" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.488132 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.587724 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8n44\" (UniqueName: \"kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44\") pod \"891c5c09-35e3-4a25-bab8-9afd39f549b4\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.587874 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities\") pod \"891c5c09-35e3-4a25-bab8-9afd39f549b4\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.587961 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content\") pod \"891c5c09-35e3-4a25-bab8-9afd39f549b4\" (UID: \"891c5c09-35e3-4a25-bab8-9afd39f549b4\") " Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.589093 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities" (OuterVolumeSpecName: "utilities") pod "891c5c09-35e3-4a25-bab8-9afd39f549b4" (UID: "891c5c09-35e3-4a25-bab8-9afd39f549b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.593744 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44" (OuterVolumeSpecName: "kube-api-access-d8n44") pod "891c5c09-35e3-4a25-bab8-9afd39f549b4" (UID: "891c5c09-35e3-4a25-bab8-9afd39f549b4"). InnerVolumeSpecName "kube-api-access-d8n44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.606056 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "891c5c09-35e3-4a25-bab8-9afd39f549b4" (UID: "891c5c09-35e3-4a25-bab8-9afd39f549b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.689764 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.689799 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/891c5c09-35e3-4a25-bab8-9afd39f549b4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:38 crc kubenswrapper[4936]: I0930 14:53:38.689809 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8n44\" (UniqueName: \"kubernetes.io/projected/891c5c09-35e3-4a25-bab8-9afd39f549b4-kube-api-access-d8n44\") on node \"crc\" DevicePath \"\"" Sep 30 14:53:39 crc kubenswrapper[4936]: I0930 14:53:39.435277 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7nhj" Sep 30 14:53:39 crc kubenswrapper[4936]: I0930 14:53:39.478305 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:39 crc kubenswrapper[4936]: I0930 14:53:39.488064 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7nhj"] Sep 30 14:53:40 crc kubenswrapper[4936]: I0930 14:53:40.327126 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" path="/var/lib/kubelet/pods/891c5c09-35e3-4a25-bab8-9afd39f549b4/volumes" Sep 30 14:53:44 crc kubenswrapper[4936]: I0930 14:53:44.315377 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:53:44 crc kubenswrapper[4936]: E0930 14:53:44.315984 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:53:59 crc kubenswrapper[4936]: I0930 14:53:59.317388 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:53:59 crc kubenswrapper[4936]: E0930 14:53:59.318768 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:54:00 crc kubenswrapper[4936]: I0930 14:54:00.919099 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75d6545796-72m9v_68de6cb4-15c5-4c0e-b924-c2fff7f03eaf/barbican-api/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.027639 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75d6545796-72m9v_68de6cb4-15c5-4c0e-b924-c2fff7f03eaf/barbican-api-log/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.225864 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77c7874cdd-k7vml_6b5ee5ab-2208-44b0-a464-f813f6314c26/barbican-keystone-listener/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.324912 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77c7874cdd-k7vml_6b5ee5ab-2208-44b0-a464-f813f6314c26/barbican-keystone-listener-log/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.453537 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf994c9f9-p76xq_c089d3fc-0428-4e46-8796-efa4f3df1fb6/barbican-worker/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.625401 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf994c9f9-p76xq_c089d3fc-0428-4e46-8796-efa4f3df1fb6/barbican-worker-log/0.log" Sep 30 14:54:01 crc kubenswrapper[4936]: I0930 14:54:01.805504 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k_418f655d-bae8-4905-8dfc-770612a750c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.035126 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/ceilometer-central-agent/0.log" Sep 30 14:54:02 crc 
kubenswrapper[4936]: I0930 14:54:02.125864 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/ceilometer-notification-agent/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.196489 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/proxy-httpd/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.300081 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/sg-core/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.448111 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp_12c8fabd-e2f8-4073-af4a-21fde9a45d85/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.606943 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q_11eb924e-ede4-4f91-a053-946c9951cf0e/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.963010 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_da5b20b7-ae2f-4d19-9f5f-f4c4404868aa/cinder-api/0.log" Sep 30 14:54:02 crc kubenswrapper[4936]: I0930 14:54:02.984841 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_da5b20b7-ae2f-4d19-9f5f-f4c4404868aa/cinder-api-log/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.372835 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2d6eb814-5e27-493b-b63e-e8eddf561330/probe/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.455140 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_2d6eb814-5e27-493b-b63e-e8eddf561330/cinder-backup/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.615865 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dbca145-2e46-484f-9676-17bde0b6fe26/cinder-scheduler/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.676720 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dbca145-2e46-484f-9676-17bde0b6fe26/probe/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.931194 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c4852707-8aff-49ed-b929-3bdcf9cd921a/cinder-volume/0.log" Sep 30 14:54:03 crc kubenswrapper[4936]: I0930 14:54:03.989270 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c4852707-8aff-49ed-b929-3bdcf9cd921a/probe/0.log" Sep 30 14:54:04 crc kubenswrapper[4936]: I0930 14:54:04.191323 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p46lf_9cab680a-f90c-4086-96f4-66c47ec4e497/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:04 crc kubenswrapper[4936]: I0930 14:54:04.304185 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2_ba550a84-9368-4cdb-8e5a-d474797cdd33/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:04 crc kubenswrapper[4936]: I0930 14:54:04.478948 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/init/0.log" Sep 30 14:54:04 crc kubenswrapper[4936]: I0930 14:54:04.675048 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/init/0.log" Sep 30 14:54:04 crc 
kubenswrapper[4936]: I0930 14:54:04.780306 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e9446dfe-8843-48eb-8514-bccc85f0727e/glance-httpd/0.log" Sep 30 14:54:04 crc kubenswrapper[4936]: I0930 14:54:04.891971 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/dnsmasq-dns/0.log" Sep 30 14:54:05 crc kubenswrapper[4936]: I0930 14:54:05.004669 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e9446dfe-8843-48eb-8514-bccc85f0727e/glance-log/0.log" Sep 30 14:54:05 crc kubenswrapper[4936]: I0930 14:54:05.486253 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3f555e7b-a6ae-4ab9-b5f1-d89581768669/glance-log/0.log" Sep 30 14:54:05 crc kubenswrapper[4936]: I0930 14:54:05.507299 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3f555e7b-a6ae-4ab9-b5f1-d89581768669/glance-httpd/0.log" Sep 30 14:54:05 crc kubenswrapper[4936]: I0930 14:54:05.701675 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon/2.log" Sep 30 14:54:05 crc kubenswrapper[4936]: I0930 14:54:05.897954 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon/1.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.036447 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon-log/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.071685 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r27jw_9c5c3ed5-0905-48db-aab6-0d2489fc7d42/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.249146 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6vjtl_73455681-e4b9-4313-991f-a00d4fab6d26/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.456761 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f8b6d55dd-xrxpc_57c8a1c9-ff09-4e19-90d2-e7552e497695/keystone-api/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.506882 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320681-l9b2f_9476f1f4-61fa-4d56-a54b-cf28db2e0d47/keystone-cron/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.750764 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fdabb8fe-7d22-4bd5-8676-378dc4500f6e/kube-state-metrics/0.log" Sep 30 14:54:06 crc kubenswrapper[4936]: I0930 14:54:06.871789 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4_acb77378-b2f6-48a5-b156-0c983ebde855/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.063581 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b4389a75-ea04-4a03-97df-6063474dd74e/manila-api-log/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.103202 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b4389a75-ea04-4a03-97df-6063474dd74e/manila-api/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.364468 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_8fa47185-cf71-4145-b46f-f524902914f3/manila-scheduler/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.377199 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8fa47185-cf71-4145-b46f-f524902914f3/probe/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.463280 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a1ecfa95-cc09-43e4-8d90-a65e4f6f74de/manila-share/0.log" Sep 30 14:54:07 crc kubenswrapper[4936]: I0930 14:54:07.632317 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a1ecfa95-cc09-43e4-8d90-a65e4f6f74de/probe/0.log" Sep 30 14:54:08 crc kubenswrapper[4936]: I0930 14:54:08.109505 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78b5b9486f-frfk9_9aa1f4b8-b399-4cf0-8d95-12a1eca674a7/neutron-api/0.log" Sep 30 14:54:08 crc kubenswrapper[4936]: I0930 14:54:08.606778 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78b5b9486f-frfk9_9aa1f4b8-b399-4cf0-8d95-12a1eca674a7/neutron-httpd/0.log" Sep 30 14:54:08 crc kubenswrapper[4936]: I0930 14:54:08.793299 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg_bf6cd6ff-b9ca-4f66-8978-0394e03fe76c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:09 crc kubenswrapper[4936]: I0930 14:54:09.758690 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_78596c14-d0a4-444c-8096-962a9359418a/nova-api-log/0.log" Sep 30 14:54:09 crc kubenswrapper[4936]: I0930 14:54:09.920601 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_78596c14-d0a4-444c-8096-962a9359418a/nova-api-api/0.log" Sep 30 14:54:10 crc kubenswrapper[4936]: I0930 14:54:10.119848 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae/nova-cell0-conductor-conductor/0.log" Sep 30 14:54:10 crc kubenswrapper[4936]: I0930 14:54:10.324896 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d9ca401b-9423-44d6-a87a-b1d5cc37b381/nova-cell1-conductor-conductor/0.log" Sep 30 14:54:10 crc kubenswrapper[4936]: I0930 14:54:10.589600 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_54f05d80-47b0-406c-a0be-856756410f2a/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 14:54:10 crc kubenswrapper[4936]: I0930 14:54:10.827100 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl_32b191a4-92aa-4f6a-998e-0877753b109d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:11 crc kubenswrapper[4936]: I0930 14:54:11.130000 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5e4f5cf-48f7-4f5a-a503-cd4d57174087/nova-metadata-log/0.log" Sep 30 14:54:11 crc kubenswrapper[4936]: I0930 14:54:11.613259 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/mysql-bootstrap/0.log" Sep 30 14:54:11 crc kubenswrapper[4936]: I0930 14:54:11.647068 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_44f3fb8e-7a0f-4e09-877d-eb823eac2b78/nova-scheduler-scheduler/0.log" Sep 30 14:54:12 crc kubenswrapper[4936]: I0930 14:54:12.008380 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/mysql-bootstrap/0.log" Sep 30 14:54:12 crc kubenswrapper[4936]: I0930 14:54:12.041161 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/galera/0.log" Sep 30 
14:54:12 crc kubenswrapper[4936]: I0930 14:54:12.290205 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/mysql-bootstrap/0.log" Sep 30 14:54:12 crc kubenswrapper[4936]: I0930 14:54:12.700836 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/galera/0.log" Sep 30 14:54:12 crc kubenswrapper[4936]: I0930 14:54:12.728898 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/mysql-bootstrap/0.log" Sep 30 14:54:13 crc kubenswrapper[4936]: I0930 14:54:13.056409 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5e4f5cf-48f7-4f5a-a503-cd4d57174087/nova-metadata-metadata/0.log" Sep 30 14:54:13 crc kubenswrapper[4936]: I0930 14:54:13.062312 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a0555978-f34e-4ada-9e39-513b4c199109/openstackclient/0.log" Sep 30 14:54:13 crc kubenswrapper[4936]: I0930 14:54:13.456132 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-k5lgl_491cf6ce-e945-4bd0-b811-b24eed9fcc12/ovn-controller/0.log" Sep 30 14:54:13 crc kubenswrapper[4936]: I0930 14:54:13.680494 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-krf5p_9443e0ef-1389-4d6a-a44d-f9863071e734/openstack-network-exporter/0.log" Sep 30 14:54:13 crc kubenswrapper[4936]: I0930 14:54:13.745622 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server-init/0.log" Sep 30 14:54:14 crc kubenswrapper[4936]: I0930 14:54:14.132578 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server-init/0.log" Sep 30 14:54:14 crc 
kubenswrapper[4936]: I0930 14:54:14.197136 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server/0.log" Sep 30 14:54:14 crc kubenswrapper[4936]: I0930 14:54:14.200966 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovs-vswitchd/0.log" Sep 30 14:54:14 crc kubenswrapper[4936]: I0930 14:54:14.324094 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:54:14 crc kubenswrapper[4936]: E0930 14:54:14.324380 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:54:14 crc kubenswrapper[4936]: I0930 14:54:14.579008 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sqxqk_70583a2b-3a7e-48fb-a59e-32778aee08fb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.074583 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1be0b97-e1de-4660-bd26-ec0a106cde3d/openstack-network-exporter/0.log" Sep 30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.116985 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1be0b97-e1de-4660-bd26-ec0a106cde3d/ovn-northd/0.log" Sep 30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.355880 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2a97be5-d38b-4352-883e-1efaf06ce24e/ovsdbserver-nb/0.log" Sep 
30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.451659 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2a97be5-d38b-4352-883e-1efaf06ce24e/openstack-network-exporter/0.log" Sep 30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.636458 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d699954-e8fb-482d-83ea-a131998407a1/openstack-network-exporter/0.log" Sep 30 14:54:15 crc kubenswrapper[4936]: I0930 14:54:15.724413 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d699954-e8fb-482d-83ea-a131998407a1/ovsdbserver-sb/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.059210 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-564967b568-8jhvh_471904cd-677c-4409-b641-15d34de36dbe/placement-api/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.134720 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-564967b568-8jhvh_471904cd-677c-4409-b641-15d34de36dbe/placement-log/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.360704 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/setup-container/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.626739 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/setup-container/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.793730 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/rabbitmq/0.log" Sep 30 14:54:16 crc kubenswrapper[4936]: I0930 14:54:16.919052 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/setup-container/0.log" Sep 30 14:54:17 crc 
kubenswrapper[4936]: I0930 14:54:17.109143 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/setup-container/0.log" Sep 30 14:54:17 crc kubenswrapper[4936]: I0930 14:54:17.224713 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/rabbitmq/0.log" Sep 30 14:54:17 crc kubenswrapper[4936]: I0930 14:54:17.934824 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2_d8b066d6-19ec-4267-8793-cfe95b74624f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:17 crc kubenswrapper[4936]: I0930 14:54:17.940648 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f_e4365ea1-ca48-47bf-af32-3e82c0a5da8f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:18 crc kubenswrapper[4936]: I0930 14:54:18.332074 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5jwct_3f95c4be-ca65-49a0-90f5-9b36926fe423/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:18 crc kubenswrapper[4936]: I0930 14:54:18.418747 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cnrw6_05c30103-ea3f-41b7-82d1-73f43681e4e4/ssh-known-hosts-edpm-deployment/0.log" Sep 30 14:54:18 crc kubenswrapper[4936]: I0930 14:54:18.640364 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_87e335f7-bd98-45d0-a733-b2fc2dd3076e/tempest-tests-tempest-tests-runner/0.log" Sep 30 14:54:19 crc kubenswrapper[4936]: I0930 14:54:19.236082 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7f8c038d-9623-40b0-b7d1-5a0f66caf6bd/test-operator-logs-container/0.log" Sep 30 
14:54:19 crc kubenswrapper[4936]: I0930 14:54:19.453750 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zskgc_d5817c4b-0566-4854-84c9-ad9a69b78172/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 14:54:26 crc kubenswrapper[4936]: I0930 14:54:26.315446 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:54:26 crc kubenswrapper[4936]: E0930 14:54:26.316067 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:54:30 crc kubenswrapper[4936]: I0930 14:54:30.036140 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e3631d52-a6a9-46fc-b109-a8e0b96bac93/memcached/0.log" Sep 30 14:54:38 crc kubenswrapper[4936]: I0930 14:54:38.316169 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:54:38 crc kubenswrapper[4936]: E0930 14:54:38.316958 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:54:42 crc kubenswrapper[4936]: I0930 14:54:42.065798 4936 generic.go:334] "Generic (PLEG): container finished" podID="e46151aa-c7f1-4555-a0ed-abb6e7f54d48" 
containerID="4ca6363ccf92f24196bfdf6d22a02543bd59982d239ec49ae1b8cd32d4ee8855" exitCode=0 Sep 30 14:54:42 crc kubenswrapper[4936]: I0930 14:54:42.065973 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-27prt" event={"ID":"e46151aa-c7f1-4555-a0ed-abb6e7f54d48","Type":"ContainerDied","Data":"4ca6363ccf92f24196bfdf6d22a02543bd59982d239ec49ae1b8cd32d4ee8855"} Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.199763 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.226983 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-27prt"] Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.235635 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-27prt"] Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.260455 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host\") pod \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.260578 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdlt6\" (UniqueName: \"kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6\") pod \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\" (UID: \"e46151aa-c7f1-4555-a0ed-abb6e7f54d48\") " Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.260659 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host" (OuterVolumeSpecName: "host") pod "e46151aa-c7f1-4555-a0ed-abb6e7f54d48" (UID: "e46151aa-c7f1-4555-a0ed-abb6e7f54d48"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.261140 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-host\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.267078 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6" (OuterVolumeSpecName: "kube-api-access-jdlt6") pod "e46151aa-c7f1-4555-a0ed-abb6e7f54d48" (UID: "e46151aa-c7f1-4555-a0ed-abb6e7f54d48"). InnerVolumeSpecName "kube-api-access-jdlt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:54:43 crc kubenswrapper[4936]: I0930 14:54:43.363915 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdlt6\" (UniqueName: \"kubernetes.io/projected/e46151aa-c7f1-4555-a0ed-abb6e7f54d48-kube-api-access-jdlt6\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.090313 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c2dd0e9aeed43c3fe27e4ed4ead3ceaa6d9a4cf673a18044bfe7610558d084" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.090398 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-27prt" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.328045 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46151aa-c7f1-4555-a0ed-abb6e7f54d48" path="/var/lib/kubelet/pods/e46151aa-c7f1-4555-a0ed-abb6e7f54d48/volumes" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.486371 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-q9k6m"] Sep 30 14:54:44 crc kubenswrapper[4936]: E0930 14:54:44.486853 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="registry-server" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.486878 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="registry-server" Sep 30 14:54:44 crc kubenswrapper[4936]: E0930 14:54:44.486897 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="extract-utilities" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.486907 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="extract-utilities" Sep 30 14:54:44 crc kubenswrapper[4936]: E0930 14:54:44.486919 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46151aa-c7f1-4555-a0ed-abb6e7f54d48" containerName="container-00" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.486927 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46151aa-c7f1-4555-a0ed-abb6e7f54d48" containerName="container-00" Sep 30 14:54:44 crc kubenswrapper[4936]: E0930 14:54:44.486943 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="extract-content" Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.486951 4936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="extract-content"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.487320 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="891c5c09-35e3-4a25-bab8-9afd39f549b4" containerName="registry-server"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.487424 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46151aa-c7f1-4555-a0ed-abb6e7f54d48" containerName="container-00"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.488275 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.585600 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.585688 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmgb\" (UniqueName: \"kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.688377 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmgb\" (UniqueName: \"kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.688624 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.688791 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.715948 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmgb\" (UniqueName: \"kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb\") pod \"crc-debug-q9k6m\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") " pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:44 crc kubenswrapper[4936]: I0930 14:54:44.816568 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:45 crc kubenswrapper[4936]: I0930 14:54:45.100985 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m" event={"ID":"b7eb5a58-d661-4419-883b-10fb95df09d5","Type":"ContainerStarted","Data":"ee3cf0a741c7169ab58562adec1d123ee6791cd5269b7790e49f64d417009820"}
Sep 30 14:54:45 crc kubenswrapper[4936]: I0930 14:54:45.101041 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m" event={"ID":"b7eb5a58-d661-4419-883b-10fb95df09d5","Type":"ContainerStarted","Data":"f0a86decb584ba9f7d34c8b0949d46519660333933f6dfe443424d34aa146949"}
Sep 30 14:54:45 crc kubenswrapper[4936]: I0930 14:54:45.120075 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m" podStartSLOduration=1.120050043 podStartE2EDuration="1.120050043s" podCreationTimestamp="2025-09-30 14:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:54:45.111078644 +0000 UTC m=+4535.495080945" watchObservedRunningTime="2025-09-30 14:54:45.120050043 +0000 UTC m=+4535.504052354"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.122325 4936 generic.go:334] "Generic (PLEG): container finished" podID="b7eb5a58-d661-4419-883b-10fb95df09d5" containerID="ee3cf0a741c7169ab58562adec1d123ee6791cd5269b7790e49f64d417009820" exitCode=0
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.122466 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m" event={"ID":"b7eb5a58-d661-4419-883b-10fb95df09d5","Type":"ContainerDied","Data":"ee3cf0a741c7169ab58562adec1d123ee6791cd5269b7790e49f64d417009820"}
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.283623 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cns66"]
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.292002 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.300320 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cns66"]
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.332659 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpgxh\" (UniqueName: \"kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.332764 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.332866 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.434320 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.434453 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpgxh\" (UniqueName: \"kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.434505 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.434925 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.435225 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.671582 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpgxh\" (UniqueName: \"kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh\") pod \"community-operators-cns66\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") " pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:47 crc kubenswrapper[4936]: I0930 14:54:47.919884 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.304460 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.482608 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host\") pod \"b7eb5a58-d661-4419-883b-10fb95df09d5\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") "
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.482916 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host" (OuterVolumeSpecName: "host") pod "b7eb5a58-d661-4419-883b-10fb95df09d5" (UID: "b7eb5a58-d661-4419-883b-10fb95df09d5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.483487 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrmgb\" (UniqueName: \"kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb\") pod \"b7eb5a58-d661-4419-883b-10fb95df09d5\" (UID: \"b7eb5a58-d661-4419-883b-10fb95df09d5\") "
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.487964 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7eb5a58-d661-4419-883b-10fb95df09d5-host\") on node \"crc\" DevicePath \"\""
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.494978 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb" (OuterVolumeSpecName: "kube-api-access-wrmgb") pod "b7eb5a58-d661-4419-883b-10fb95df09d5" (UID: "b7eb5a58-d661-4419-883b-10fb95df09d5"). InnerVolumeSpecName "kube-api-access-wrmgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:54:48 crc kubenswrapper[4936]: W0930 14:54:48.540521 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9a5fc3_ce8f_4bec_bd44_fa429d3cbfed.slice/crio-fc25d1a387c6bad4f3c3a35da29b775f392a462ee3ace5ced2d66491709270d7 WatchSource:0}: Error finding container fc25d1a387c6bad4f3c3a35da29b775f392a462ee3ace5ced2d66491709270d7: Status 404 returned error can't find the container with id fc25d1a387c6bad4f3c3a35da29b775f392a462ee3ace5ced2d66491709270d7
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.553540 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cns66"]
Sep 30 14:54:48 crc kubenswrapper[4936]: I0930 14:54:48.589786 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrmgb\" (UniqueName: \"kubernetes.io/projected/b7eb5a58-d661-4419-883b-10fb95df09d5-kube-api-access-wrmgb\") on node \"crc\" DevicePath \"\""
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.152512 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerID="d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7" exitCode=0
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.152801 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerDied","Data":"d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7"}
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.152828 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerStarted","Data":"fc25d1a387c6bad4f3c3a35da29b775f392a462ee3ace5ced2d66491709270d7"}
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.159915 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m" event={"ID":"b7eb5a58-d661-4419-883b-10fb95df09d5","Type":"ContainerDied","Data":"f0a86decb584ba9f7d34c8b0949d46519660333933f6dfe443424d34aa146949"}
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.159965 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0a86decb584ba9f7d34c8b0949d46519660333933f6dfe443424d34aa146949"
Sep 30 14:54:49 crc kubenswrapper[4936]: I0930 14:54:49.160045 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-q9k6m"
Sep 30 14:54:50 crc kubenswrapper[4936]: I0930 14:54:50.170322 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerStarted","Data":"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1"}
Sep 30 14:54:50 crc kubenswrapper[4936]: I0930 14:54:50.325929 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72"
Sep 30 14:54:50 crc kubenswrapper[4936]: E0930 14:54:50.326158 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 14:54:51 crc kubenswrapper[4936]: I0930 14:54:51.900735 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-q9k6m"]
Sep 30 14:54:51 crc kubenswrapper[4936]: I0930 14:54:51.909323 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-q9k6m"]
Sep 30 14:54:52 crc kubenswrapper[4936]: I0930 14:54:52.203370 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerID="81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1" exitCode=0
Sep 30 14:54:52 crc kubenswrapper[4936]: I0930 14:54:52.203705 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerDied","Data":"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1"}
Sep 30 14:54:52 crc kubenswrapper[4936]: I0930 14:54:52.327915 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7eb5a58-d661-4419-883b-10fb95df09d5" path="/var/lib/kubelet/pods/b7eb5a58-d661-4419-883b-10fb95df09d5/volumes"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.132275 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-r6kxw"]
Sep 30 14:54:53 crc kubenswrapper[4936]: E0930 14:54:53.132974 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7eb5a58-d661-4419-883b-10fb95df09d5" containerName="container-00"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.132987 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7eb5a58-d661-4419-883b-10fb95df09d5" containerName="container-00"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.133203 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7eb5a58-d661-4419-883b-10fb95df09d5" containerName="container-00"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.133818 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.197855 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.197961 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29992\" (UniqueName: \"kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.215060 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerStarted","Data":"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370"}
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.233924 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cns66" podStartSLOduration=2.598631686 podStartE2EDuration="6.233904636s" podCreationTimestamp="2025-09-30 14:54:47 +0000 UTC" firstStartedPulling="2025-09-30 14:54:49.155312507 +0000 UTC m=+4539.539314808" lastFinishedPulling="2025-09-30 14:54:52.790585457 +0000 UTC m=+4543.174587758" observedRunningTime="2025-09-30 14:54:53.229866614 +0000 UTC m=+4543.613868925" watchObservedRunningTime="2025-09-30 14:54:53.233904636 +0000 UTC m=+4543.617906937"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.300027 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.300153 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.300604 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29992\" (UniqueName: \"kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.319696 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29992\" (UniqueName: \"kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992\") pod \"crc-debug-r6kxw\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") " pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:53 crc kubenswrapper[4936]: I0930 14:54:53.453615 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:54 crc kubenswrapper[4936]: I0930 14:54:54.228747 4936 generic.go:334] "Generic (PLEG): container finished" podID="a524b2df-691f-428c-8011-889c25d874c0" containerID="efa2dd0c0d1017708432bc8806b190d183190b67f2c046e6dda71f470529bdf0" exitCode=0
Sep 30 14:54:54 crc kubenswrapper[4936]: I0930 14:54:54.228832 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw" event={"ID":"a524b2df-691f-428c-8011-889c25d874c0","Type":"ContainerDied","Data":"efa2dd0c0d1017708432bc8806b190d183190b67f2c046e6dda71f470529bdf0"}
Sep 30 14:54:54 crc kubenswrapper[4936]: I0930 14:54:54.229653 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw" event={"ID":"a524b2df-691f-428c-8011-889c25d874c0","Type":"ContainerStarted","Data":"a33e7769bb4147fc2f8d31efdadb85ecb3e06f395d5f7738e947cd9ec056440a"}
Sep 30 14:54:54 crc kubenswrapper[4936]: I0930 14:54:54.270212 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-r6kxw"]
Sep 30 14:54:54 crc kubenswrapper[4936]: I0930 14:54:54.281443 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wmvv9/crc-debug-r6kxw"]
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.675052 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.751191 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29992\" (UniqueName: \"kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992\") pod \"a524b2df-691f-428c-8011-889c25d874c0\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") "
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.751468 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host\") pod \"a524b2df-691f-428c-8011-889c25d874c0\" (UID: \"a524b2df-691f-428c-8011-889c25d874c0\") "
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.751640 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host" (OuterVolumeSpecName: "host") pod "a524b2df-691f-428c-8011-889c25d874c0" (UID: "a524b2df-691f-428c-8011-889c25d874c0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.752402 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a524b2df-691f-428c-8011-889c25d874c0-host\") on node \"crc\" DevicePath \"\""
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.757687 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992" (OuterVolumeSpecName: "kube-api-access-29992") pod "a524b2df-691f-428c-8011-889c25d874c0" (UID: "a524b2df-691f-428c-8011-889c25d874c0"). InnerVolumeSpecName "kube-api-access-29992". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:54:55 crc kubenswrapper[4936]: I0930 14:54:55.854579 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29992\" (UniqueName: \"kubernetes.io/projected/a524b2df-691f-428c-8011-889c25d874c0-kube-api-access-29992\") on node \"crc\" DevicePath \"\""
Sep 30 14:54:56 crc kubenswrapper[4936]: I0930 14:54:56.248204 4936 scope.go:117] "RemoveContainer" containerID="efa2dd0c0d1017708432bc8806b190d183190b67f2c046e6dda71f470529bdf0"
Sep 30 14:54:56 crc kubenswrapper[4936]: I0930 14:54:56.248230 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmvv9/crc-debug-r6kxw"
Sep 30 14:54:56 crc kubenswrapper[4936]: I0930 14:54:56.332845 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a524b2df-691f-428c-8011-889c25d874c0" path="/var/lib/kubelet/pods/a524b2df-691f-428c-8011-889c25d874c0/volumes"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.002400 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.643084 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.670291 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.742551 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.885392 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.914389 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.916496 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/extract/0.log"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.921241 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.921287 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:57 crc kubenswrapper[4936]: I0930 14:54:57.973629 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.100788 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-x5g5r_650ff8e9-279f-41ff-8bb8-1880e7cf985c/kube-rbac-proxy/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.213202 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-6c8cz_7cc732ee-78f8-4d20-aac8-67ab10b944d3/kube-rbac-proxy/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.227493 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-x5g5r_650ff8e9-279f-41ff-8bb8-1880e7cf985c/manager/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.326582 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.381742 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cns66"]
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.397583 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-6c8cz_7cc732ee-78f8-4d20-aac8-67ab10b944d3/manager/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.429927 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xz9zg_9d8425ad-dcdc-4d31-9a5c-9461adb3296c/kube-rbac-proxy/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.497213 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xz9zg_9d8425ad-dcdc-4d31-9a5c-9461adb3296c/manager/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.687531 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fk2zk_bd9be0ef-9048-4e0a-b8d7-1b29b450984f/kube-rbac-proxy/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.849709 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fk2zk_bd9be0ef-9048-4e0a-b8d7-1b29b450984f/manager/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.913367 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-g5k7h_f632d83e-2c2c-4c90-8fea-5747d58633d6/manager/0.log"
Sep 30 14:54:58 crc kubenswrapper[4936]: I0930 14:54:58.951049 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-g5k7h_f632d83e-2c2c-4c90-8fea-5747d58633d6/kube-rbac-proxy/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.067928 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9jjnj_13bd8563-ccb1-4445-b613-495e801195a4/kube-rbac-proxy/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.124542 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9jjnj_13bd8563-ccb1-4445-b613-495e801195a4/manager/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.289726 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-hwpcz_7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6/kube-rbac-proxy/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.403399 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-hwpcz_7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6/manager/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.462211 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-n28ht_189d95a0-9ee0-4055-86c2-724082d46a11/kube-rbac-proxy/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.566784 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-n28ht_189d95a0-9ee0-4055-86c2-724082d46a11/manager/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.620415 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-hgdlx_c54f70b2-5767-4616-878b-5816861d2637/kube-rbac-proxy/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.719619 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-hgdlx_c54f70b2-5767-4616-878b-5816861d2637/manager/0.log"
Sep 30 14:54:59 crc kubenswrapper[4936]: I0930 14:54:59.856986 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-sxftv_d08bae3c-64f1-46de-ab2c-d6b2407c2d95/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.009241 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-sxftv_d08bae3c-64f1-46de-ab2c-d6b2407c2d95/manager/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.089473 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8tpqq_847b4871-2d23-4790-b32a-b42698008fee/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.132242 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8tpqq_847b4871-2d23-4790-b32a-b42698008fee/manager/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.232344 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kldb_e546e1bb-9ee4-4549-9521-76d122b4edf5/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.283423 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cns66" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="registry-server" containerID="cri-o://5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370" gracePeriod=2
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.410619 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kldb_e546e1bb-9ee4-4549-9521-76d122b4edf5/manager/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.515245 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-zvzfj_5ec29297-f0db-497d-aa05-e939e9aef380/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.692264 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-zvzfj_5ec29297-f0db-497d-aa05-e939e9aef380/manager/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.695698 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-lhbld_d50d2534-deec-4173-a73a-d10b3beac452/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.760574 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-lhbld_d50d2534-deec-4173-a73a-d10b3beac452/manager/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.773000 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cns66"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.863364 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpgxh\" (UniqueName: \"kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh\") pod \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") "
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.863615 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content\") pod \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") "
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.863660 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities\") pod \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\" (UID: \"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed\") "
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.864688 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities" (OuterVolumeSpecName: "utilities") pod "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" (UID: "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.869107 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh" (OuterVolumeSpecName: "kube-api-access-cpgxh") pod "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" (UID: "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed"). InnerVolumeSpecName "kube-api-access-cpgxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.925586 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" (UID: "bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.966220 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpgxh\" (UniqueName: \"kubernetes.io/projected/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-kube-api-access-cpgxh\") on node \"crc\" DevicePath \"\""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.966255 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.966268 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.992570 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-5mdhl_b1538e13-4b0e-4bb9-9277-3d0475cd41a4/kube-rbac-proxy/0.log"
Sep 30 14:55:00 crc kubenswrapper[4936]: I0930 14:55:00.992706 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-5mdhl_b1538e13-4b0e-4bb9-9277-3d0475cd41a4/manager/0.log"
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.151958 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-64c94699b9-lqnhn_b4e378e0-0a69-47c9-b80f-fee159c4ad5b/kube-rbac-proxy/0.log"
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.222864 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69769bbb6-9mvrz_4ea53f04-2776-4e45-9444-6255d7fd2860/kube-rbac-proxy/0.log"
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.294842 4936 generic.go:334] "Generic (PLEG): container finished" podID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerID="5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370" exitCode=0
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.295083 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerDied","Data":"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370"}
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.295161 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cns66" event={"ID":"bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed","Type":"ContainerDied","Data":"fc25d1a387c6bad4f3c3a35da29b775f392a462ee3ace5ced2d66491709270d7"}
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.295227 4936 scope.go:117] "RemoveContainer" containerID="5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370"
Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.295447 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cns66" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.319781 4936 scope.go:117] "RemoveContainer" containerID="81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.347282 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cns66"] Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.353075 4936 scope.go:117] "RemoveContainer" containerID="d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.370482 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cns66"] Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.389189 4936 scope.go:117] "RemoveContainer" containerID="5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370" Sep 30 14:55:01 crc kubenswrapper[4936]: E0930 14:55:01.389752 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370\": container with ID starting with 5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370 not found: ID does not exist" containerID="5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.389779 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370"} err="failed to get container status \"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370\": rpc error: code = NotFound desc = could not find container \"5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370\": container with ID starting with 5e58a49a04154cfa5f2b16a35af8098a235aeac30de2baa4fdfbf25485b02370 not 
found: ID does not exist" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.389798 4936 scope.go:117] "RemoveContainer" containerID="81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1" Sep 30 14:55:01 crc kubenswrapper[4936]: E0930 14:55:01.400723 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1\": container with ID starting with 81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1 not found: ID does not exist" containerID="81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.400760 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1"} err="failed to get container status \"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1\": rpc error: code = NotFound desc = could not find container \"81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1\": container with ID starting with 81b5d7211dbcc968f216713297243b27f670af23bc0be4720018750be4d329e1 not found: ID does not exist" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.400803 4936 scope.go:117] "RemoveContainer" containerID="d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7" Sep 30 14:55:01 crc kubenswrapper[4936]: E0930 14:55:01.401412 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7\": container with ID starting with d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7 not found: ID does not exist" containerID="d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.401435 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7"} err="failed to get container status \"d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7\": rpc error: code = NotFound desc = could not find container \"d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7\": container with ID starting with d7c51836cd224a8ff737607e7da0b4bc16b3347b273fbfa68276d6bbb3ca75c7 not found: ID does not exist" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.509964 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69769bbb6-9mvrz_4ea53f04-2776-4e45-9444-6255d7fd2860/operator/0.log" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.587468 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b5lzr_6fc87ca3-ce0e-4976-b45f-cf28709a6f9f/registry-server/0.log" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.785390 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-sthwj_85448d25-86e9-4a2f-bc5a-339ab3d2112a/kube-rbac-proxy/0.log" Sep 30 14:55:01 crc kubenswrapper[4936]: I0930 14:55:01.919252 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-sthwj_85448d25-86e9-4a2f-bc5a-339ab3d2112a/manager/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.084082 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-sxfzz_699f5243-7ea5-4f7f-a537-51a99a871ccb/kube-rbac-proxy/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.176573 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-sxfzz_699f5243-7ea5-4f7f-a537-51a99a871ccb/manager/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.352493 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" path="/var/lib/kubelet/pods/bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed/volumes" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.514113 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-n6mv9_bbbf4ed1-241b-4c4e-80e0-77acd778b868/kube-rbac-proxy/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.515116 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-49bq2_d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28/operator/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.601771 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-n6mv9_bbbf4ed1-241b-4c4e-80e0-77acd778b868/manager/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.685546 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-64c94699b9-lqnhn_b4e378e0-0a69-47c9-b80f-fee159c4ad5b/manager/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.774426 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-7rfmc_78f55939-d8fc-40d5-bc8e-a3f87b962b34/kube-rbac-proxy/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.858415 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-7rfmc_78f55939-d8fc-40d5-bc8e-a3f87b962b34/manager/0.log" Sep 30 14:55:02 crc kubenswrapper[4936]: I0930 14:55:02.966303 4936 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-x2csq_673847ae-740d-4a3b-ad7e-09ec8848199d/kube-rbac-proxy/0.log" Sep 30 14:55:03 crc kubenswrapper[4936]: I0930 14:55:03.146692 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-x2csq_673847ae-740d-4a3b-ad7e-09ec8848199d/manager/0.log" Sep 30 14:55:03 crc kubenswrapper[4936]: I0930 14:55:03.194043 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-d2tm5_639a60da-010a-40d5-bfec-6219ef3f712b/kube-rbac-proxy/0.log" Sep 30 14:55:03 crc kubenswrapper[4936]: I0930 14:55:03.201871 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-d2tm5_639a60da-010a-40d5-bfec-6219ef3f712b/manager/0.log" Sep 30 14:55:04 crc kubenswrapper[4936]: I0930 14:55:04.316191 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:55:04 crc kubenswrapper[4936]: E0930 14:55:04.316604 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:55:17 crc kubenswrapper[4936]: I0930 14:55:17.320215 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:55:17 crc kubenswrapper[4936]: E0930 14:55:17.320897 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 14:55:19 crc kubenswrapper[4936]: I0930 14:55:19.800571 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6mxph_6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce/control-plane-machine-set-operator/0.log" Sep 30 14:55:19 crc kubenswrapper[4936]: I0930 14:55:19.989700 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rw728_27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30/kube-rbac-proxy/0.log" Sep 30 14:55:19 crc kubenswrapper[4936]: I0930 14:55:19.994879 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rw728_27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30/machine-api-operator/0.log" Sep 30 14:55:31 crc kubenswrapper[4936]: I0930 14:55:31.315010 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:55:32 crc kubenswrapper[4936]: I0930 14:55:32.135417 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wm65j_392d0573-afef-4492-9768-2d9c4830d7b8/cert-manager-controller/0.log" Sep 30 14:55:32 crc kubenswrapper[4936]: I0930 14:55:32.192465 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xwsfb_b59b1137-6114-4d73-8593-250d0da0b741/cert-manager-cainjector/0.log" Sep 30 14:55:32 crc kubenswrapper[4936]: I0930 14:55:32.379159 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-z76zc_a9648b53-2a15-447e-bca5-87692ab32278/cert-manager-webhook/0.log" Sep 30 14:55:32 crc 
kubenswrapper[4936]: I0930 14:55:32.586100 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e"} Sep 30 14:55:44 crc kubenswrapper[4936]: I0930 14:55:44.860950 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-ngb5v_f690b866-399e-4bf7-bed4-c261098bfbb1/nmstate-console-plugin/0.log" Sep 30 14:55:44 crc kubenswrapper[4936]: I0930 14:55:44.964733 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lgvdl_6b791ba1-37e6-440f-899d-7db4972b74f5/nmstate-handler/0.log" Sep 30 14:55:45 crc kubenswrapper[4936]: I0930 14:55:45.076822 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-gj98k_e80b68f2-d68e-4499-84a4-8a83b18922c6/kube-rbac-proxy/0.log" Sep 30 14:55:45 crc kubenswrapper[4936]: I0930 14:55:45.096954 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-gj98k_e80b68f2-d68e-4499-84a4-8a83b18922c6/nmstate-metrics/0.log" Sep 30 14:55:45 crc kubenswrapper[4936]: I0930 14:55:45.272537 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-284mm_50cdc3bf-7d9e-4644-87b2-81e93c15174a/nmstate-webhook/0.log" Sep 30 14:55:45 crc kubenswrapper[4936]: I0930 14:55:45.288677 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-cln2v_4409f314-97f6-4776-980f-c7727fa7fd18/nmstate-operator/0.log" Sep 30 14:56:01 crc kubenswrapper[4936]: I0930 14:56:01.955991 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pkkdt_63b433c1-ca17-4e41-9412-8c9abede7b39/controller/0.log" Sep 30 14:56:02 crc 
kubenswrapper[4936]: I0930 14:56:02.126417 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pkkdt_63b433c1-ca17-4e41-9412-8c9abede7b39/kube-rbac-proxy/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.275561 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.573893 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.609667 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.653188 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.778808 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.870728 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log" Sep 30 14:56:02 crc kubenswrapper[4936]: I0930 14:56:02.880223 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.007670 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.092250 4936 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.354175 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/controller/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.410706 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.431258 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.465790 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.678992 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/kube-rbac-proxy/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.694017 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/frr-metrics/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.732275 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/kube-rbac-proxy-frr/0.log" Sep 30 14:56:03 crc kubenswrapper[4936]: I0930 14:56:03.921394 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/reloader/0.log" Sep 30 14:56:04 crc kubenswrapper[4936]: I0930 14:56:04.083115 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-chnlb_f304af2d-f6f0-4be9-8388-a81870af995f/frr-k8s-webhook-server/0.log" Sep 30 14:56:04 crc kubenswrapper[4936]: I0930 14:56:04.346241 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd76b6558-4gwpj_d4775401-fba2-4958-b075-6862db490e18/manager/0.log" Sep 30 14:56:04 crc kubenswrapper[4936]: I0930 14:56:04.528295 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c67dfbd86-j4s2n_ab06cf8d-01c1-45c8-9c95-6f3369b8ef75/webhook-server/0.log" Sep 30 14:56:04 crc kubenswrapper[4936]: I0930 14:56:04.806973 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-24z6s_f673a383-44a6-4fe9-a432-f84341817e89/kube-rbac-proxy/0.log" Sep 30 14:56:04 crc kubenswrapper[4936]: I0930 14:56:04.951933 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/frr/0.log" Sep 30 14:56:05 crc kubenswrapper[4936]: I0930 14:56:05.465486 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-24z6s_f673a383-44a6-4fe9-a432-f84341817e89/speaker/0.log" Sep 30 14:56:19 crc kubenswrapper[4936]: I0930 14:56:19.687219 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log" Sep 30 14:56:19 crc kubenswrapper[4936]: I0930 14:56:19.932031 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log" Sep 30 14:56:19 crc kubenswrapper[4936]: I0930 14:56:19.944056 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.002927 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.125068 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.166784 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/extract/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.175986 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.344771 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.551574 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.554799 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.592720 4936 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.711536 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.781793 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log" Sep 30 14:56:20 crc kubenswrapper[4936]: I0930 14:56:20.917303 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.216674 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.316551 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.352471 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/registry-server/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.359346 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.587130 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.601870 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log" Sep 30 14:56:21 crc kubenswrapper[4936]: I0930 14:56:21.972542 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.391832 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/registry-server/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.412035 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.452590 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.475273 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.700789 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/extract/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.712548 4936 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log" Sep 30 14:56:22 crc kubenswrapper[4936]: I0930 14:56:22.798062 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.022225 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q587x_18fdb3dd-ed9e-4625-9bb8-7f2a079396dd/marketplace-operator/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.133951 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.435627 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.436175 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.438974 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.653917 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.684625 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.715228 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.843276 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/registry-server/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.905667 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.958899 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log" Sep 30 14:56:23 crc kubenswrapper[4936]: I0930 14:56:23.997926 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log" Sep 30 14:56:24 crc kubenswrapper[4936]: I0930 14:56:24.155423 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log" Sep 30 14:56:24 crc kubenswrapper[4936]: I0930 14:56:24.187564 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log" Sep 30 14:56:24 crc kubenswrapper[4936]: I0930 14:56:24.495317 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/registry-server/0.log" Sep 30 
14:57:48 crc kubenswrapper[4936]: I0930 14:57:48.250098 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:57:48 crc kubenswrapper[4936]: I0930 14:57:48.250685 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.992478 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:57:57 crc kubenswrapper[4936]: E0930 14:57:57.993524 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="registry-server" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993542 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="registry-server" Sep 30 14:57:57 crc kubenswrapper[4936]: E0930 14:57:57.993576 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="extract-content" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993585 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="extract-content" Sep 30 14:57:57 crc kubenswrapper[4936]: E0930 14:57:57.993608 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="extract-utilities" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993616 4936 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="extract-utilities" Sep 30 14:57:57 crc kubenswrapper[4936]: E0930 14:57:57.993642 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a524b2df-691f-428c-8011-889c25d874c0" containerName="container-00" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993649 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a524b2df-691f-428c-8011-889c25d874c0" containerName="container-00" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993876 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9a5fc3-ce8f-4bec-bd44-fa429d3cbfed" containerName="registry-server" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.993891 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a524b2df-691f-428c-8011-889c25d874c0" containerName="container-00" Sep 30 14:57:57 crc kubenswrapper[4936]: I0930 14:57:57.996687 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.024541 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.073126 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.073260 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.073308 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5w2\" (UniqueName: \"kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.175683 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.175802 4936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qf5w2\" (UniqueName: \"kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.175946 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.176354 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.176391 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.202184 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5w2\" (UniqueName: \"kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2\") pod \"redhat-operators-svkr7\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.320868 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:57:58 crc kubenswrapper[4936]: I0930 14:57:58.935133 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:57:59 crc kubenswrapper[4936]: I0930 14:57:59.963270 4936 generic.go:334] "Generic (PLEG): container finished" podID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerID="ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833" exitCode=0 Sep 30 14:57:59 crc kubenswrapper[4936]: I0930 14:57:59.963527 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerDied","Data":"ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833"} Sep 30 14:57:59 crc kubenswrapper[4936]: I0930 14:57:59.963629 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerStarted","Data":"d879f8b193aa424f408a55eced82200b4b9a1d7e88f118ebda0efefb4e1db753"} Sep 30 14:57:59 crc kubenswrapper[4936]: I0930 14:57:59.966630 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:58:00 crc kubenswrapper[4936]: I0930 14:58:00.972630 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerStarted","Data":"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd"} Sep 30 14:58:05 crc kubenswrapper[4936]: I0930 14:58:05.013118 4936 generic.go:334] "Generic (PLEG): container finished" podID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerID="681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd" exitCode=0 Sep 30 14:58:05 crc kubenswrapper[4936]: I0930 14:58:05.013212 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerDied","Data":"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd"} Sep 30 14:58:06 crc kubenswrapper[4936]: I0930 14:58:06.025901 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerStarted","Data":"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082"} Sep 30 14:58:06 crc kubenswrapper[4936]: I0930 14:58:06.048642 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-svkr7" podStartSLOduration=3.530596977 podStartE2EDuration="9.048626442s" podCreationTimestamp="2025-09-30 14:57:57 +0000 UTC" firstStartedPulling="2025-09-30 14:57:59.966363481 +0000 UTC m=+4730.350365782" lastFinishedPulling="2025-09-30 14:58:05.484392946 +0000 UTC m=+4735.868395247" observedRunningTime="2025-09-30 14:58:06.048598441 +0000 UTC m=+4736.432600742" watchObservedRunningTime="2025-09-30 14:58:06.048626442 +0000 UTC m=+4736.432628743" Sep 30 14:58:08 crc kubenswrapper[4936]: I0930 14:58:08.325922 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:08 crc kubenswrapper[4936]: I0930 14:58:08.326588 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:09 crc kubenswrapper[4936]: I0930 14:58:09.370774 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svkr7" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" probeResult="failure" output=< Sep 30 14:58:09 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:58:09 crc kubenswrapper[4936]: > Sep 30 14:58:18 crc kubenswrapper[4936]: I0930 
14:58:18.249779 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:58:18 crc kubenswrapper[4936]: I0930 14:58:18.250404 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:58:19 crc kubenswrapper[4936]: I0930 14:58:19.381489 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svkr7" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" probeResult="failure" output=< Sep 30 14:58:19 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 14:58:19 crc kubenswrapper[4936]: > Sep 30 14:58:28 crc kubenswrapper[4936]: I0930 14:58:28.377995 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:28 crc kubenswrapper[4936]: I0930 14:58:28.433253 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:29 crc kubenswrapper[4936]: I0930 14:58:29.192986 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.243800 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svkr7" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" 
containerID="cri-o://97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082" gracePeriod=2 Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.731661 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.799039 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content\") pod \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.799200 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities\") pod \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.799316 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf5w2\" (UniqueName: \"kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2\") pod \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\" (UID: \"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3\") " Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.801801 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities" (OuterVolumeSpecName: "utilities") pod "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" (UID: "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.806239 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2" (OuterVolumeSpecName: "kube-api-access-qf5w2") pod "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" (UID: "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3"). InnerVolumeSpecName "kube-api-access-qf5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.901322 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" (UID: "c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.901582 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.901604 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:30 crc kubenswrapper[4936]: I0930 14:58:30.901615 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf5w2\" (UniqueName: \"kubernetes.io/projected/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3-kube-api-access-qf5w2\") on node \"crc\" DevicePath \"\"" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.255305 4936 generic.go:334] "Generic (PLEG): container finished" podID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" 
containerID="97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082" exitCode=0 Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.255388 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerDied","Data":"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082"} Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.255400 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svkr7" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.256751 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svkr7" event={"ID":"c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3","Type":"ContainerDied","Data":"d879f8b193aa424f408a55eced82200b4b9a1d7e88f118ebda0efefb4e1db753"} Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.256841 4936 scope.go:117] "RemoveContainer" containerID="97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.277936 4936 scope.go:117] "RemoveContainer" containerID="681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.298098 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.304500 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-svkr7"] Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.311798 4936 scope.go:117] "RemoveContainer" containerID="ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.349311 4936 scope.go:117] "RemoveContainer" containerID="97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082" Sep 30 14:58:31 crc 
kubenswrapper[4936]: E0930 14:58:31.349602 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082\": container with ID starting with 97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082 not found: ID does not exist" containerID="97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.349631 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082"} err="failed to get container status \"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082\": rpc error: code = NotFound desc = could not find container \"97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082\": container with ID starting with 97cf863070b00432a966bb09837b74ff0039098b1ac1e1646ccc049a172a6082 not found: ID does not exist" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.349653 4936 scope.go:117] "RemoveContainer" containerID="681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd" Sep 30 14:58:31 crc kubenswrapper[4936]: E0930 14:58:31.349878 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd\": container with ID starting with 681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd not found: ID does not exist" containerID="681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.349903 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd"} err="failed to get container status 
\"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd\": rpc error: code = NotFound desc = could not find container \"681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd\": container with ID starting with 681e6dee5b6659a7b9149af029afe2e0e78cd2e87348b0bdd2b441b6b8b6c5bd not found: ID does not exist" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.349916 4936 scope.go:117] "RemoveContainer" containerID="ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833" Sep 30 14:58:31 crc kubenswrapper[4936]: E0930 14:58:31.350110 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833\": container with ID starting with ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833 not found: ID does not exist" containerID="ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833" Sep 30 14:58:31 crc kubenswrapper[4936]: I0930 14:58:31.350133 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833"} err="failed to get container status \"ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833\": rpc error: code = NotFound desc = could not find container \"ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833\": container with ID starting with ae9cb38be6e0a88c514cc66e21d489650b1f021f106c3c5a7b4a70dff9931833 not found: ID does not exist" Sep 30 14:58:32 crc kubenswrapper[4936]: I0930 14:58:32.326778 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" path="/var/lib/kubelet/pods/c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3/volumes" Sep 30 14:58:42 crc kubenswrapper[4936]: I0930 14:58:42.156571 4936 scope.go:117] "RemoveContainer" containerID="4ca6363ccf92f24196bfdf6d22a02543bd59982d239ec49ae1b8cd32d4ee8855" Sep 30 
14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.250441 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.251039 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.251089 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.251832 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.251875 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e" gracePeriod=600 Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.416552 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" 
containerID="703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e" exitCode=0 Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.416606 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e"} Sep 30 14:58:48 crc kubenswrapper[4936]: I0930 14:58:48.416645 4936 scope.go:117] "RemoveContainer" containerID="3c3725ff9aeceb0ef2becb5f8c9dc7c45a1c52262891ea911a0e0defea452f72" Sep 30 14:58:49 crc kubenswrapper[4936]: I0930 14:58:49.430728 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"} Sep 30 14:59:07 crc kubenswrapper[4936]: I0930 14:59:07.596699 4936 generic.go:334] "Generic (PLEG): container finished" podID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerID="669c3bc5bd85483968723725e7849b76776d3f63a19c482e2e67e5ae15fa28d5" exitCode=0 Sep 30 14:59:07 crc kubenswrapper[4936]: I0930 14:59:07.596792 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" event={"ID":"bb09b42b-b116-4eaf-91f9-acbb3f472edc","Type":"ContainerDied","Data":"669c3bc5bd85483968723725e7849b76776d3f63a19c482e2e67e5ae15fa28d5"} Sep 30 14:59:07 crc kubenswrapper[4936]: I0930 14:59:07.597855 4936 scope.go:117] "RemoveContainer" containerID="669c3bc5bd85483968723725e7849b76776d3f63a19c482e2e67e5ae15fa28d5" Sep 30 14:59:07 crc kubenswrapper[4936]: I0930 14:59:07.933430 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wmvv9_must-gather-wf5rj_bb09b42b-b116-4eaf-91f9-acbb3f472edc/gather/0.log" Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.365846 4936 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wmvv9/must-gather-wf5rj"] Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.366644 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="copy" containerID="cri-o://cfd86fa9fdeeb2199192c4d84171105091af0558492a75c8f6ade580bf7bdccf" gracePeriod=2 Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.379068 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wmvv9/must-gather-wf5rj"] Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.691471 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wmvv9_must-gather-wf5rj_bb09b42b-b116-4eaf-91f9-acbb3f472edc/copy/0.log" Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.692952 4936 generic.go:334] "Generic (PLEG): container finished" podID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerID="cfd86fa9fdeeb2199192c4d84171105091af0558492a75c8f6ade580bf7bdccf" exitCode=143 Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.813072 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wmvv9_must-gather-wf5rj_bb09b42b-b116-4eaf-91f9-acbb3f472edc/copy/0.log" Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.814376 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.993888 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kp7m\" (UniqueName: \"kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m\") pod \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " Sep 30 14:59:17 crc kubenswrapper[4936]: I0930 14:59:17.993958 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output\") pod \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\" (UID: \"bb09b42b-b116-4eaf-91f9-acbb3f472edc\") " Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.001311 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m" (OuterVolumeSpecName: "kube-api-access-6kp7m") pod "bb09b42b-b116-4eaf-91f9-acbb3f472edc" (UID: "bb09b42b-b116-4eaf-91f9-acbb3f472edc"). InnerVolumeSpecName "kube-api-access-6kp7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.096525 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kp7m\" (UniqueName: \"kubernetes.io/projected/bb09b42b-b116-4eaf-91f9-acbb3f472edc-kube-api-access-6kp7m\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.175860 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bb09b42b-b116-4eaf-91f9-acbb3f472edc" (UID: "bb09b42b-b116-4eaf-91f9-acbb3f472edc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.199188 4936 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb09b42b-b116-4eaf-91f9-acbb3f472edc-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.342680 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" path="/var/lib/kubelet/pods/bb09b42b-b116-4eaf-91f9-acbb3f472edc/volumes" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.701571 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wmvv9_must-gather-wf5rj_bb09b42b-b116-4eaf-91f9-acbb3f472edc/copy/0.log" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.702042 4936 scope.go:117] "RemoveContainer" containerID="cfd86fa9fdeeb2199192c4d84171105091af0558492a75c8f6ade580bf7bdccf" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.702270 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wmvv9/must-gather-wf5rj" Sep 30 14:59:18 crc kubenswrapper[4936]: I0930 14:59:18.728024 4936 scope.go:117] "RemoveContainer" containerID="669c3bc5bd85483968723725e7849b76776d3f63a19c482e2e67e5ae15fa28d5" Sep 30 14:59:42 crc kubenswrapper[4936]: I0930 14:59:42.218513 4936 scope.go:117] "RemoveContainer" containerID="bec5befcf72266937665f8207d96646d247e1a4d8236063c597ef187ad4523bc" Sep 30 14:59:42 crc kubenswrapper[4936]: I0930 14:59:42.249930 4936 scope.go:117] "RemoveContainer" containerID="60b772266b390a227840fe27cd9882e4014eb2034c28d5c50e0a69813fc1a68e" Sep 30 14:59:42 crc kubenswrapper[4936]: I0930 14:59:42.305489 4936 scope.go:117] "RemoveContainer" containerID="36177f893736dac29c8c3bbe7766f535fed00c75e7f00f7d4cc044ca154acdea" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.741671 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-764s8/must-gather-5wk72"] Sep 30 14:59:57 crc kubenswrapper[4936]: E0930 14:59:57.753024 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="gather" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753059 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="gather" Sep 30 14:59:57 crc kubenswrapper[4936]: E0930 14:59:57.753072 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753080 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" Sep 30 14:59:57 crc kubenswrapper[4936]: E0930 14:59:57.753095 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="copy" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753103 4936 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="copy" Sep 30 14:59:57 crc kubenswrapper[4936]: E0930 14:59:57.753126 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="extract-content" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753134 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="extract-content" Sep 30 14:59:57 crc kubenswrapper[4936]: E0930 14:59:57.753168 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="extract-utilities" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753175 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="extract-utilities" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753400 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="gather" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753425 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c38e3d-1863-4fad-92e6-ef6f1a2a03f3" containerName="registry-server" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.753446 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb09b42b-b116-4eaf-91f9-acbb3f472edc" containerName="copy" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.754600 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-764s8/must-gather-5wk72"] Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.754699 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.758411 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-764s8"/"kube-root-ca.crt" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.759456 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-764s8"/"openshift-service-ca.crt" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.892523 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k78w\" (UniqueName: \"kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w\") pod \"must-gather-5wk72\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.892950 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output\") pod \"must-gather-5wk72\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.994268 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output\") pod \"must-gather-5wk72\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.994379 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k78w\" (UniqueName: \"kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w\") pod \"must-gather-5wk72\" (UID: 
\"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:57 crc kubenswrapper[4936]: I0930 14:59:57.994722 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output\") pod \"must-gather-5wk72\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:58 crc kubenswrapper[4936]: I0930 14:59:58.012100 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k78w\" (UniqueName: \"kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w\") pod \"must-gather-5wk72\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:58 crc kubenswrapper[4936]: I0930 14:59:58.071447 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 14:59:58 crc kubenswrapper[4936]: I0930 14:59:58.550905 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-764s8/must-gather-5wk72"] Sep 30 14:59:59 crc kubenswrapper[4936]: I0930 14:59:59.131639 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/must-gather-5wk72" event={"ID":"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5","Type":"ContainerStarted","Data":"362de18b3f46dcad8d48dc9efdf947737b0ea239d86a9f8ae8826327381b2c00"} Sep 30 14:59:59 crc kubenswrapper[4936]: I0930 14:59:59.132225 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/must-gather-5wk72" event={"ID":"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5","Type":"ContainerStarted","Data":"f9aa55a33392f7bf4a89d8c7a5fa42253822c14b27130a5c85a3cf6ec5c7fcd2"} Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.141546 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-764s8/must-gather-5wk72" event={"ID":"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5","Type":"ContainerStarted","Data":"56fa05d4cb8d390a3fcb555f2593137a0807428d32b9e655267d3440a423bf1b"} Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.161281 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd"] Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.162940 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.171223 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-764s8/must-gather-5wk72" podStartSLOduration=3.171202632 podStartE2EDuration="3.171202632s" podCreationTimestamp="2025-09-30 14:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:00:00.171146401 +0000 UTC m=+4850.555148702" watchObservedRunningTime="2025-09-30 15:00:00.171202632 +0000 UTC m=+4850.555204933" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.171497 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.171664 4936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.195528 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd"] Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.235441 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.235520 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.235597 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk58x\" (UniqueName: \"kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.337588 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.337654 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc 
kubenswrapper[4936]: I0930 15:00:00.337726 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk58x\" (UniqueName: \"kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.340399 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.349653 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.360084 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk58x\" (UniqueName: \"kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x\") pod \"collect-profiles-29320740-xqfpd\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:00 crc kubenswrapper[4936]: I0930 15:00:00.487675 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:01 crc kubenswrapper[4936]: I0930 15:00:01.003825 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd"] Sep 30 15:00:01 crc kubenswrapper[4936]: W0930 15:00:01.005779 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056ab085_dea7_487a_a31c_4c8b93e0719b.slice/crio-e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738 WatchSource:0}: Error finding container e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738: Status 404 returned error can't find the container with id e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738 Sep 30 15:00:01 crc kubenswrapper[4936]: I0930 15:00:01.162853 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" event={"ID":"056ab085-dea7-487a-a31c-4c8b93e0719b","Type":"ContainerStarted","Data":"e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738"} Sep 30 15:00:02 crc kubenswrapper[4936]: I0930 15:00:02.174565 4936 generic.go:334] "Generic (PLEG): container finished" podID="056ab085-dea7-487a-a31c-4c8b93e0719b" containerID="771fd400817f20e34a0c48232999614c7f49b38e80dbd122432784f2014d04e7" exitCode=0 Sep 30 15:00:02 crc kubenswrapper[4936]: I0930 15:00:02.174682 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" event={"ID":"056ab085-dea7-487a-a31c-4c8b93e0719b","Type":"ContainerDied","Data":"771fd400817f20e34a0c48232999614c7f49b38e80dbd122432784f2014d04e7"} Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.591307 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.613452 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk58x\" (UniqueName: \"kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x\") pod \"056ab085-dea7-487a-a31c-4c8b93e0719b\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.613515 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume\") pod \"056ab085-dea7-487a-a31c-4c8b93e0719b\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.613673 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume\") pod \"056ab085-dea7-487a-a31c-4c8b93e0719b\" (UID: \"056ab085-dea7-487a-a31c-4c8b93e0719b\") " Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.614744 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume" (OuterVolumeSpecName: "config-volume") pod "056ab085-dea7-487a-a31c-4c8b93e0719b" (UID: "056ab085-dea7-487a-a31c-4c8b93e0719b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.662453 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x" (OuterVolumeSpecName: "kube-api-access-vk58x") pod "056ab085-dea7-487a-a31c-4c8b93e0719b" (UID: "056ab085-dea7-487a-a31c-4c8b93e0719b"). 
InnerVolumeSpecName "kube-api-access-vk58x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.663985 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "056ab085-dea7-487a-a31c-4c8b93e0719b" (UID: "056ab085-dea7-487a-a31c-4c8b93e0719b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.715680 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk58x\" (UniqueName: \"kubernetes.io/projected/056ab085-dea7-487a-a31c-4c8b93e0719b-kube-api-access-vk58x\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.716087 4936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/056ab085-dea7-487a-a31c-4c8b93e0719b-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:03 crc kubenswrapper[4936]: I0930 15:00:03.716099 4936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/056ab085-dea7-487a-a31c-4c8b93e0719b-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.200693 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" event={"ID":"056ab085-dea7-487a-a31c-4c8b93e0719b","Type":"ContainerDied","Data":"e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738"} Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.201172 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e5865b947f450d6df24a796b145bf7fc856fee8f73437d225e1972d27a7738" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.200720 4936 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-xqfpd" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.573876 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-764s8/crc-debug-fdwn9"] Sep 30 15:00:04 crc kubenswrapper[4936]: E0930 15:00:04.574874 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056ab085-dea7-487a-a31c-4c8b93e0719b" containerName="collect-profiles" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.574893 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="056ab085-dea7-487a-a31c-4c8b93e0719b" containerName="collect-profiles" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.575102 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="056ab085-dea7-487a-a31c-4c8b93e0719b" containerName="collect-profiles" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.575779 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.580474 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-764s8"/"default-dockercfg-rf594" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.645295 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsjmj\" (UniqueName: \"kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.645426 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.670265 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6"] Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.680484 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-6cps6"] Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.746819 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.746987 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.747006 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsjmj\" (UniqueName: \"kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.780592 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsjmj\" (UniqueName: \"kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj\") pod \"crc-debug-fdwn9\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: I0930 15:00:04.891553 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:00:04 crc kubenswrapper[4936]: W0930 15:00:04.956921 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e56aaf_ce14_4fec_b042_d021787a4db3.slice/crio-79f59c171fc98a27a3f46c102364f098eae62aeb9745b1fe8ac6b414addfe33b WatchSource:0}: Error finding container 79f59c171fc98a27a3f46c102364f098eae62aeb9745b1fe8ac6b414addfe33b: Status 404 returned error can't find the container with id 79f59c171fc98a27a3f46c102364f098eae62aeb9745b1fe8ac6b414addfe33b Sep 30 15:00:05 crc kubenswrapper[4936]: I0930 15:00:05.210078 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-fdwn9" event={"ID":"b3e56aaf-ce14-4fec-b042-d021787a4db3","Type":"ContainerStarted","Data":"79f59c171fc98a27a3f46c102364f098eae62aeb9745b1fe8ac6b414addfe33b"} Sep 30 15:00:06 crc kubenswrapper[4936]: I0930 15:00:06.223271 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-fdwn9" event={"ID":"b3e56aaf-ce14-4fec-b042-d021787a4db3","Type":"ContainerStarted","Data":"d6bae0cdd33ab14137a3613b4d01fadedcd9169d7d00646dbb9f918cf2141a42"} Sep 30 15:00:06 crc kubenswrapper[4936]: I0930 15:00:06.252016 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-764s8/crc-debug-fdwn9" podStartSLOduration=2.251990357 podStartE2EDuration="2.251990357s" podCreationTimestamp="2025-09-30 15:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:00:06.243637127 +0000 UTC m=+4856.627639448" watchObservedRunningTime="2025-09-30 15:00:06.251990357 +0000 UTC m=+4856.635992658" Sep 30 15:00:06 crc kubenswrapper[4936]: I0930 15:00:06.329583 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b80ba637-9833-410d-8ca9-012957837b73" path="/var/lib/kubelet/pods/b80ba637-9833-410d-8ca9-012957837b73/volumes" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.721690 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jchdr"] Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.724833 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.743740 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jchdr"] Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.854748 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5d5\" (UniqueName: \"kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.854812 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.856026 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.958739 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.958857 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5d5\" (UniqueName: \"kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.958901 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.959576 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.959853 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:24 crc kubenswrapper[4936]: I0930 15:00:24.988302 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4f5d5\" (UniqueName: \"kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5\") pod \"certified-operators-jchdr\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") " pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:25 crc kubenswrapper[4936]: I0930 15:00:25.047151 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jchdr" Sep 30 15:00:25 crc kubenswrapper[4936]: I0930 15:00:25.660987 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jchdr"] Sep 30 15:00:25 crc kubenswrapper[4936]: W0930 15:00:25.674983 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c444251_b726_49d9_a0b9_8b4988070a53.slice/crio-0f1a77c88eb7cd67c87cb5628de5583c9c231f76978f540c1ab7fb0cb60acc1b WatchSource:0}: Error finding container 0f1a77c88eb7cd67c87cb5628de5583c9c231f76978f540c1ab7fb0cb60acc1b: Status 404 returned error can't find the container with id 0f1a77c88eb7cd67c87cb5628de5583c9c231f76978f540c1ab7fb0cb60acc1b Sep 30 15:00:26 crc kubenswrapper[4936]: I0930 15:00:26.411803 4936 generic.go:334] "Generic (PLEG): container finished" podID="7c444251-b726-49d9-a0b9-8b4988070a53" containerID="347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e" exitCode=0 Sep 30 15:00:26 crc kubenswrapper[4936]: I0930 15:00:26.412352 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerDied","Data":"347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e"} Sep 30 15:00:26 crc kubenswrapper[4936]: I0930 15:00:26.412386 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" 
event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerStarted","Data":"0f1a77c88eb7cd67c87cb5628de5583c9c231f76978f540c1ab7fb0cb60acc1b"}
Sep 30 15:00:27 crc kubenswrapper[4936]: I0930 15:00:27.425587 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerStarted","Data":"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"}
Sep 30 15:00:30 crc kubenswrapper[4936]: I0930 15:00:30.456655 4936 generic.go:334] "Generic (PLEG): container finished" podID="7c444251-b726-49d9-a0b9-8b4988070a53" containerID="3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51" exitCode=0
Sep 30 15:00:30 crc kubenswrapper[4936]: I0930 15:00:30.456726 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerDied","Data":"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"}
Sep 30 15:00:33 crc kubenswrapper[4936]: I0930 15:00:33.481250 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerStarted","Data":"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"}
Sep 30 15:00:33 crc kubenswrapper[4936]: I0930 15:00:33.509687 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jchdr" podStartSLOduration=3.636788652 podStartE2EDuration="9.509664647s" podCreationTimestamp="2025-09-30 15:00:24 +0000 UTC" firstStartedPulling="2025-09-30 15:00:26.414984623 +0000 UTC m=+4876.798986924" lastFinishedPulling="2025-09-30 15:00:32.287860618 +0000 UTC m=+4882.671862919" observedRunningTime="2025-09-30 15:00:33.497445301 +0000 UTC m=+4883.881447602" watchObservedRunningTime="2025-09-30 15:00:33.509664647 +0000 UTC m=+4883.893666948"
Sep 30 15:00:35 crc kubenswrapper[4936]: I0930 15:00:35.051114 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:35 crc kubenswrapper[4936]: I0930 15:00:35.051926 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:36 crc kubenswrapper[4936]: I0930 15:00:36.116143 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jchdr" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="registry-server" probeResult="failure" output=<
Sep 30 15:00:36 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s
Sep 30 15:00:36 crc kubenswrapper[4936]: >
Sep 30 15:00:42 crc kubenswrapper[4936]: I0930 15:00:42.427022 4936 scope.go:117] "RemoveContainer" containerID="f358bbc3c56b5e7a9ff0718b6063f2ebfc0dc9ebac2ab4d54453bccfb01a4dfc"
Sep 30 15:00:45 crc kubenswrapper[4936]: I0930 15:00:45.114432 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:45 crc kubenswrapper[4936]: I0930 15:00:45.181783 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:45 crc kubenswrapper[4936]: I0930 15:00:45.355821 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jchdr"]
Sep 30 15:00:46 crc kubenswrapper[4936]: I0930 15:00:46.635858 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jchdr" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="registry-server" containerID="cri-o://1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711" gracePeriod=2
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.299831 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.484054 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities\") pod \"7c444251-b726-49d9-a0b9-8b4988070a53\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") "
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.484393 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5d5\" (UniqueName: \"kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5\") pod \"7c444251-b726-49d9-a0b9-8b4988070a53\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") "
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.484509 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content\") pod \"7c444251-b726-49d9-a0b9-8b4988070a53\" (UID: \"7c444251-b726-49d9-a0b9-8b4988070a53\") "
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.484791 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities" (OuterVolumeSpecName: "utilities") pod "7c444251-b726-49d9-a0b9-8b4988070a53" (UID: "7c444251-b726-49d9-a0b9-8b4988070a53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.485059 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.491812 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5" (OuterVolumeSpecName: "kube-api-access-4f5d5") pod "7c444251-b726-49d9-a0b9-8b4988070a53" (UID: "7c444251-b726-49d9-a0b9-8b4988070a53"). InnerVolumeSpecName "kube-api-access-4f5d5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.588612 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5d5\" (UniqueName: \"kubernetes.io/projected/7c444251-b726-49d9-a0b9-8b4988070a53-kube-api-access-4f5d5\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.588783 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c444251-b726-49d9-a0b9-8b4988070a53" (UID: "7c444251-b726-49d9-a0b9-8b4988070a53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.646070 4936 generic.go:334] "Generic (PLEG): container finished" podID="7c444251-b726-49d9-a0b9-8b4988070a53" containerID="1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711" exitCode=0
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.646433 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerDied","Data":"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"}
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.646459 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jchdr" event={"ID":"7c444251-b726-49d9-a0b9-8b4988070a53","Type":"ContainerDied","Data":"0f1a77c88eb7cd67c87cb5628de5583c9c231f76978f540c1ab7fb0cb60acc1b"}
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.646499 4936 scope.go:117] "RemoveContainer" containerID="1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.646668 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jchdr"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.675905 4936 scope.go:117] "RemoveContainer" containerID="3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.691313 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c444251-b726-49d9-a0b9-8b4988070a53-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.732765 4936 scope.go:117] "RemoveContainer" containerID="347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.732841 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jchdr"]
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.748258 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jchdr"]
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.768869 4936 scope.go:117] "RemoveContainer" containerID="1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"
Sep 30 15:00:47 crc kubenswrapper[4936]: E0930 15:00:47.769174 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711\": container with ID starting with 1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711 not found: ID does not exist" containerID="1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.769230 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711"} err="failed to get container status \"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711\": rpc error: code = NotFound desc = could not find container \"1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711\": container with ID starting with 1e0c3bdc8e1373c0e971e9f2a4ff61910b928bcc9097f6842afd25b037bd8711 not found: ID does not exist"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.769250 4936 scope.go:117] "RemoveContainer" containerID="3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"
Sep 30 15:00:47 crc kubenswrapper[4936]: E0930 15:00:47.769509 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51\": container with ID starting with 3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51 not found: ID does not exist" containerID="3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.769526 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51"} err="failed to get container status \"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51\": rpc error: code = NotFound desc = could not find container \"3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51\": container with ID starting with 3b749f46067907f64b5e8c0547eb565c422e20cc8777455177eaace7f484bc51 not found: ID does not exist"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.769539 4936 scope.go:117] "RemoveContainer" containerID="347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e"
Sep 30 15:00:47 crc kubenswrapper[4936]: E0930 15:00:47.769963 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e\": container with ID starting with 347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e not found: ID does not exist" containerID="347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e"
Sep 30 15:00:47 crc kubenswrapper[4936]: I0930 15:00:47.769982 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e"} err="failed to get container status \"347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e\": rpc error: code = NotFound desc = could not find container \"347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e\": container with ID starting with 347ebfcfbd40d317ff98d72554c5befaee9d293b29afb9dfab100c10b7e54b9e not found: ID does not exist"
Sep 30 15:00:48 crc kubenswrapper[4936]: I0930 15:00:48.250412 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:00:48 crc kubenswrapper[4936]: I0930 15:00:48.250835 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:00:48 crc kubenswrapper[4936]: I0930 15:00:48.331070 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" path="/var/lib/kubelet/pods/7c444251-b726-49d9-a0b9-8b4988070a53/volumes"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.161621 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320741-lhs6q"]
Sep 30 15:01:00 crc kubenswrapper[4936]: E0930 15:01:00.162465 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="registry-server"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.162479 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="registry-server"
Sep 30 15:01:00 crc kubenswrapper[4936]: E0930 15:01:00.162504 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="extract-utilities"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.162510 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="extract-utilities"
Sep 30 15:01:00 crc kubenswrapper[4936]: E0930 15:01:00.162536 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="extract-content"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.162542 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="extract-content"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.162720 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c444251-b726-49d9-a0b9-8b4988070a53" containerName="registry-server"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.163407 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.200216 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320741-lhs6q"]
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.214037 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.214236 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.214316 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.214533 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmwd\" (UniqueName: \"kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.319465 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmwd\" (UniqueName: \"kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.319576 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.319633 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.319656 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.327378 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.341234 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.353240 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.355862 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmwd\" (UniqueName: \"kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd\") pod \"keystone-cron-29320741-lhs6q\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") " pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:00 crc kubenswrapper[4936]: I0930 15:01:00.480596 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:01 crc kubenswrapper[4936]: I0930 15:01:01.093427 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320741-lhs6q"]
Sep 30 15:01:01 crc kubenswrapper[4936]: W0930 15:01:01.099586 4936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab0e7875_c029_4f42_928e_34e1d276bb70.slice/crio-a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223 WatchSource:0}: Error finding container a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223: Status 404 returned error can't find the container with id a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223
Sep 30 15:01:01 crc kubenswrapper[4936]: I0930 15:01:01.790804 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-lhs6q" event={"ID":"ab0e7875-c029-4f42-928e-34e1d276bb70","Type":"ContainerStarted","Data":"462a634ac01774d476025c8900a478435a28f1166defa9f45ee31e99a093d7ab"}
Sep 30 15:01:01 crc kubenswrapper[4936]: I0930 15:01:01.792241 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-lhs6q" event={"ID":"ab0e7875-c029-4f42-928e-34e1d276bb70","Type":"ContainerStarted","Data":"a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223"}
Sep 30 15:01:01 crc kubenswrapper[4936]: I0930 15:01:01.809221 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320741-lhs6q" podStartSLOduration=1.809200669 podStartE2EDuration="1.809200669s" podCreationTimestamp="2025-09-30 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:01:01.808434248 +0000 UTC m=+4912.192436559" watchObservedRunningTime="2025-09-30 15:01:01.809200669 +0000 UTC m=+4912.193202970"
Sep 30 15:01:06 crc kubenswrapper[4936]: I0930 15:01:06.886126 4936 generic.go:334] "Generic (PLEG): container finished" podID="ab0e7875-c029-4f42-928e-34e1d276bb70" containerID="462a634ac01774d476025c8900a478435a28f1166defa9f45ee31e99a093d7ab" exitCode=0
Sep 30 15:01:06 crc kubenswrapper[4936]: I0930 15:01:06.886179 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-lhs6q" event={"ID":"ab0e7875-c029-4f42-928e-34e1d276bb70","Type":"ContainerDied","Data":"462a634ac01774d476025c8900a478435a28f1166defa9f45ee31e99a093d7ab"}
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.882207 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.917263 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-lhs6q" event={"ID":"ab0e7875-c029-4f42-928e-34e1d276bb70","Type":"ContainerDied","Data":"a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223"}
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.917317 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5957366a032d3844b6e31de8ecc765ea8d78995cb91eb12be6d1c85c9ee3223"
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.917417 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-lhs6q"
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.923967 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys\") pod \"ab0e7875-c029-4f42-928e-34e1d276bb70\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") "
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.924002 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle\") pod \"ab0e7875-c029-4f42-928e-34e1d276bb70\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") "
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.924041 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjmwd\" (UniqueName: \"kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd\") pod \"ab0e7875-c029-4f42-928e-34e1d276bb70\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") "
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.924174 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data\") pod \"ab0e7875-c029-4f42-928e-34e1d276bb70\" (UID: \"ab0e7875-c029-4f42-928e-34e1d276bb70\") "
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.946066 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd" (OuterVolumeSpecName: "kube-api-access-bjmwd") pod "ab0e7875-c029-4f42-928e-34e1d276bb70" (UID: "ab0e7875-c029-4f42-928e-34e1d276bb70"). InnerVolumeSpecName "kube-api-access-bjmwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:01:08 crc kubenswrapper[4936]: I0930 15:01:08.947016 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ab0e7875-c029-4f42-928e-34e1d276bb70" (UID: "ab0e7875-c029-4f42-928e-34e1d276bb70"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.006465 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0e7875-c029-4f42-928e-34e1d276bb70" (UID: "ab0e7875-c029-4f42-928e-34e1d276bb70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.026901 4936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.026935 4936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.026949 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjmwd\" (UniqueName: \"kubernetes.io/projected/ab0e7875-c029-4f42-928e-34e1d276bb70-kube-api-access-bjmwd\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.062971 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data" (OuterVolumeSpecName: "config-data") pod "ab0e7875-c029-4f42-928e-34e1d276bb70" (UID: "ab0e7875-c029-4f42-928e-34e1d276bb70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 15:01:09 crc kubenswrapper[4936]: I0930 15:01:09.128470 4936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0e7875-c029-4f42-928e-34e1d276bb70-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 15:01:18 crc kubenswrapper[4936]: I0930 15:01:18.249795 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:01:18 crc kubenswrapper[4936]: I0930 15:01:18.250327 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:01:42 crc kubenswrapper[4936]: I0930 15:01:42.610183 4936 scope.go:117] "RemoveContainer" containerID="ee3cf0a741c7169ab58562adec1d123ee6791cd5269b7790e49f64d417009820"
Sep 30 15:01:48 crc kubenswrapper[4936]: I0930 15:01:48.249946 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:01:48 crc kubenswrapper[4936]: I0930 15:01:48.250605 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:01:48 crc kubenswrapper[4936]: I0930 15:01:48.250648 4936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz"
Sep 30 15:01:48 crc kubenswrapper[4936]: I0930 15:01:48.251468 4936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"} pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 15:01:48 crc kubenswrapper[4936]: I0930 15:01:48.251523 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" containerID="cri-o://d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" gracePeriod=600
Sep 30 15:01:48 crc kubenswrapper[4936]: E0930 15:01:48.380884 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:01:49 crc kubenswrapper[4936]: I0930 15:01:49.280258 4936 generic.go:334] "Generic (PLEG): container finished" podID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" exitCode=0
Sep 30 15:01:49 crc kubenswrapper[4936]: I0930 15:01:49.280300 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerDied","Data":"d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"}
Sep 30 15:01:49 crc kubenswrapper[4936]: I0930 15:01:49.280406 4936 scope.go:117] "RemoveContainer" containerID="703cf0e717c235b899f5fdee3e05487c1763c18996c91683b2bb28080783463e"
Sep 30 15:01:49 crc kubenswrapper[4936]: I0930 15:01:49.280930 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:01:49 crc kubenswrapper[4936]: E0930 15:01:49.281157 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.139386 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75d6545796-72m9v_68de6cb4-15c5-4c0e-b924-c2fff7f03eaf/barbican-api/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.228728 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75d6545796-72m9v_68de6cb4-15c5-4c0e-b924-c2fff7f03eaf/barbican-api-log/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.419454 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77c7874cdd-k7vml_6b5ee5ab-2208-44b0-a464-f813f6314c26/barbican-keystone-listener/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.522576 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77c7874cdd-k7vml_6b5ee5ab-2208-44b0-a464-f813f6314c26/barbican-keystone-listener-log/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.721820 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf994c9f9-p76xq_c089d3fc-0428-4e46-8796-efa4f3df1fb6/barbican-worker/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.751609 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf994c9f9-p76xq_c089d3fc-0428-4e46-8796-efa4f3df1fb6/barbican-worker-log/0.log"
Sep 30 15:01:59 crc kubenswrapper[4936]: I0930 15:01:59.964872 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k4t2k_418f655d-bae8-4905-8dfc-770612a750c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:00 crc kubenswrapper[4936]: I0930 15:02:00.247832 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/proxy-httpd/0.log"
Sep 30 15:02:00 crc kubenswrapper[4936]: I0930 15:02:00.272523 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/ceilometer-central-agent/0.log"
Sep 30 15:02:00 crc kubenswrapper[4936]: I0930 15:02:00.335078 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/ceilometer-notification-agent/0.log"
Sep 30 15:02:00 crc kubenswrapper[4936]: I0930 15:02:00.484793 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_70edb384-47db-473c-95d3-28a20a1857e0/sg-core/0.log"
Sep 30 15:02:00 crc kubenswrapper[4936]: I0930 15:02:00.549156 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-clkzp_12c8fabd-e2f8-4073-af4a-21fde9a45d85/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:01 crc kubenswrapper[4936]: I0930 15:02:01.045213 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-w7l6q_11eb924e-ede4-4f91-a053-946c9951cf0e/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:01 crc kubenswrapper[4936]: I0930 15:02:01.230968 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_da5b20b7-ae2f-4d19-9f5f-f4c4404868aa/cinder-api-log/0.log"
Sep 30 15:02:01 crc kubenswrapper[4936]: I0930 15:02:01.349104 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_da5b20b7-ae2f-4d19-9f5f-f4c4404868aa/cinder-api/0.log"
Sep 30 15:02:01 crc kubenswrapper[4936]: I0930 15:02:01.600963 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2d6eb814-5e27-493b-b63e-e8eddf561330/probe/0.log"
Sep 30 15:02:01 crc kubenswrapper[4936]: I0930 15:02:01.803815 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2d6eb814-5e27-493b-b63e-e8eddf561330/cinder-backup/0.log"
Sep 30 15:02:02 crc kubenswrapper[4936]: I0930 15:02:02.005004 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dbca145-2e46-484f-9676-17bde0b6fe26/cinder-scheduler/0.log"
Sep 30 15:02:02 crc kubenswrapper[4936]: I0930 15:02:02.143832 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8dbca145-2e46-484f-9676-17bde0b6fe26/probe/0.log"
Sep 30 15:02:02 crc kubenswrapper[4936]: I0930 15:02:02.423154 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c4852707-8aff-49ed-b929-3bdcf9cd921a/cinder-volume/0.log"
Sep 30 15:02:02 crc kubenswrapper[4936]: I0930 15:02:02.515827 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c4852707-8aff-49ed-b929-3bdcf9cd921a/probe/0.log"
Sep 30 15:02:02 crc kubenswrapper[4936]: I0930 15:02:02.647316 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p46lf_9cab680a-f90c-4086-96f4-66c47ec4e497/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.315528 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:02:03 crc kubenswrapper[4936]: E0930 15:02:03.316108 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.426739 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/init/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.476011 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ztxc2_ba550a84-9368-4cdb-8e5a-d474797cdd33/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.704667 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/init/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.839010 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e9446dfe-8843-48eb-8514-bccc85f0727e/glance-httpd/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.840658 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77766fdf55-rzbvr_e1481c32-550d-4129-856d-8bc79389c0d3/dnsmasq-dns/0.log"
Sep 30 15:02:03 crc kubenswrapper[4936]: I0930 15:02:03.922009 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e9446dfe-8843-48eb-8514-bccc85f0727e/glance-log/0.log"
Sep 30 15:02:04 crc kubenswrapper[4936]: I0930 15:02:04.074912 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3f555e7b-a6ae-4ab9-b5f1-d89581768669/glance-httpd/0.log"
Sep 30 15:02:04 crc kubenswrapper[4936]: I0930 15:02:04.088754 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3f555e7b-a6ae-4ab9-b5f1-d89581768669/glance-log/0.log"
Sep 30 15:02:04 crc kubenswrapper[4936]: I0930 15:02:04.446424 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon/2.log"
Sep 30 15:02:04 crc kubenswrapper[4936]: I0930 15:02:04.477922 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon/1.log"
Sep 30 15:02:04 crc kubenswrapper[4936]: I0930 15:02:04.557226 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b866fc884-w2td6_1e28ad1d-adf7-4316-9df6-db8a7c1e3933/horizon-log/0.log"
Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.105186 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r27jw_9c5c3ed5-0905-48db-aab6-0d2489fc7d42/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Sep 30 15:02:05 crc kubenswrapper[4936]: I0930
15:02:05.207195 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6vjtl_73455681-e4b9-4313-991f-a00d4fab6d26/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.442055 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320681-l9b2f_9476f1f4-61fa-4d56-a54b-cf28db2e0d47/keystone-cron/0.log" Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.495492 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f8b6d55dd-xrxpc_57c8a1c9-ff09-4e19-90d2-e7552e497695/keystone-api/0.log" Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.642043 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320741-lhs6q_ab0e7875-c029-4f42-928e-34e1d276bb70/keystone-cron/0.log" Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.794895 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fdabb8fe-7d22-4bd5-8676-378dc4500f6e/kube-state-metrics/0.log" Sep 30 15:02:05 crc kubenswrapper[4936]: I0930 15:02:05.913947 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4hbc4_acb77378-b2f6-48a5-b156-0c983ebde855/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.219181 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b4389a75-ea04-4a03-97df-6063474dd74e/manila-api-log/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.226612 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b4389a75-ea04-4a03-97df-6063474dd74e/manila-api/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.429689 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_8fa47185-cf71-4145-b46f-f524902914f3/probe/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.478554 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8fa47185-cf71-4145-b46f-f524902914f3/manila-scheduler/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.558655 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a1ecfa95-cc09-43e4-8d90-a65e4f6f74de/manila-share/0.log" Sep 30 15:02:06 crc kubenswrapper[4936]: I0930 15:02:06.669436 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a1ecfa95-cc09-43e4-8d90-a65e4f6f74de/probe/0.log" Sep 30 15:02:07 crc kubenswrapper[4936]: I0930 15:02:07.077748 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78b5b9486f-frfk9_9aa1f4b8-b399-4cf0-8d95-12a1eca674a7/neutron-api/0.log" Sep 30 15:02:07 crc kubenswrapper[4936]: I0930 15:02:07.150056 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-78b5b9486f-frfk9_9aa1f4b8-b399-4cf0-8d95-12a1eca674a7/neutron-httpd/0.log" Sep 30 15:02:07 crc kubenswrapper[4936]: I0930 15:02:07.321918 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zbjxg_bf6cd6ff-b9ca-4f66-8978-0394e03fe76c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:08 crc kubenswrapper[4936]: I0930 15:02:08.117155 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_78596c14-d0a4-444c-8096-962a9359418a/nova-api-log/0.log" Sep 30 15:02:08 crc kubenswrapper[4936]: I0930 15:02:08.455483 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a9dca4ae-4150-4d74-9ce1-bd3a5fbd45ae/nova-cell0-conductor-conductor/0.log" Sep 30 15:02:08 crc kubenswrapper[4936]: I0930 15:02:08.714326 4936 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_78596c14-d0a4-444c-8096-962a9359418a/nova-api-api/0.log" Sep 30 15:02:09 crc kubenswrapper[4936]: I0930 15:02:09.059259 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d9ca401b-9423-44d6-a87a-b1d5cc37b381/nova-cell1-conductor-conductor/0.log" Sep 30 15:02:09 crc kubenswrapper[4936]: I0930 15:02:09.201309 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_54f05d80-47b0-406c-a0be-856756410f2a/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 15:02:09 crc kubenswrapper[4936]: I0930 15:02:09.413824 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-s8nsl_32b191a4-92aa-4f6a-998e-0877753b109d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:09 crc kubenswrapper[4936]: I0930 15:02:09.876835 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5e4f5cf-48f7-4f5a-a503-cd4d57174087/nova-metadata-log/0.log" Sep 30 15:02:10 crc kubenswrapper[4936]: I0930 15:02:10.543170 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_44f3fb8e-7a0f-4e09-877d-eb823eac2b78/nova-scheduler-scheduler/0.log" Sep 30 15:02:10 crc kubenswrapper[4936]: I0930 15:02:10.756960 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/mysql-bootstrap/0.log" Sep 30 15:02:10 crc kubenswrapper[4936]: I0930 15:02:10.987084 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/mysql-bootstrap/0.log" Sep 30 15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.023539 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2544e332-54a0-46cc-8077-417e83eed982/galera/0.log" Sep 30 
15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.354015 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/mysql-bootstrap/0.log" Sep 30 15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.557704 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/mysql-bootstrap/0.log" Sep 30 15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.740524 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4205821a-580b-4f4c-9e89-9fa6aae93378/galera/0.log" Sep 30 15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.887453 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5e4f5cf-48f7-4f5a-a503-cd4d57174087/nova-metadata-metadata/0.log" Sep 30 15:02:11 crc kubenswrapper[4936]: I0930 15:02:11.972994 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a0555978-f34e-4ada-9e39-513b4c199109/openstackclient/0.log" Sep 30 15:02:12 crc kubenswrapper[4936]: I0930 15:02:12.192182 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-k5lgl_491cf6ce-e945-4bd0-b811-b24eed9fcc12/ovn-controller/0.log" Sep 30 15:02:12 crc kubenswrapper[4936]: I0930 15:02:12.963437 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-krf5p_9443e0ef-1389-4d6a-a44d-f9863071e734/openstack-network-exporter/0.log" Sep 30 15:02:13 crc kubenswrapper[4936]: I0930 15:02:13.166838 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server-init/0.log" Sep 30 15:02:13 crc kubenswrapper[4936]: I0930 15:02:13.391687 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server-init/0.log" Sep 30 15:02:13 crc 
kubenswrapper[4936]: I0930 15:02:13.439618 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovsdb-server/0.log" Sep 30 15:02:13 crc kubenswrapper[4936]: I0930 15:02:13.463943 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-747gv_2654d46a-f44e-45b2-862d-55e5eda229b7/ovs-vswitchd/0.log" Sep 30 15:02:13 crc kubenswrapper[4936]: I0930 15:02:13.774391 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sqxqk_70583a2b-3a7e-48fb-a59e-32778aee08fb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:13 crc kubenswrapper[4936]: I0930 15:02:13.954133 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1be0b97-e1de-4660-bd26-ec0a106cde3d/openstack-network-exporter/0.log" Sep 30 15:02:14 crc kubenswrapper[4936]: I0930 15:02:14.530608 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2a97be5-d38b-4352-883e-1efaf06ce24e/openstack-network-exporter/0.log" Sep 30 15:02:14 crc kubenswrapper[4936]: I0930 15:02:14.617663 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1be0b97-e1de-4660-bd26-ec0a106cde3d/ovn-northd/0.log" Sep 30 15:02:14 crc kubenswrapper[4936]: I0930 15:02:14.772526 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c2a97be5-d38b-4352-883e-1efaf06ce24e/ovsdbserver-nb/0.log" Sep 30 15:02:14 crc kubenswrapper[4936]: I0930 15:02:14.949454 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d699954-e8fb-482d-83ea-a131998407a1/openstack-network-exporter/0.log" Sep 30 15:02:15 crc kubenswrapper[4936]: I0930 15:02:15.066032 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9d699954-e8fb-482d-83ea-a131998407a1/ovsdbserver-sb/0.log" Sep 30 
15:02:15 crc kubenswrapper[4936]: I0930 15:02:15.344443 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-564967b568-8jhvh_471904cd-677c-4409-b641-15d34de36dbe/placement-api/0.log" Sep 30 15:02:15 crc kubenswrapper[4936]: I0930 15:02:15.473921 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-564967b568-8jhvh_471904cd-677c-4409-b641-15d34de36dbe/placement-log/0.log" Sep 30 15:02:15 crc kubenswrapper[4936]: I0930 15:02:15.656424 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/setup-container/0.log" Sep 30 15:02:15 crc kubenswrapper[4936]: I0930 15:02:15.997506 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/rabbitmq/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.036633 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ba18f440-0c9a-45d0-a1de-9f363bc654cf/setup-container/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.301094 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/setup-container/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.566876 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/rabbitmq/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.684161 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fd17158a-07d3-477e-8aa6-d03c3cb277c8/setup-container/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.875182 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xcfj2_d8b066d6-19ec-4267-8793-cfe95b74624f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:16 crc kubenswrapper[4936]: I0930 15:02:16.946100 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gdz2f_e4365ea1-ca48-47bf-af32-3e82c0a5da8f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:17 crc kubenswrapper[4936]: I0930 15:02:17.302784 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5jwct_3f95c4be-ca65-49a0-90f5-9b36926fe423/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:17 crc kubenswrapper[4936]: I0930 15:02:17.315162 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:02:17 crc kubenswrapper[4936]: E0930 15:02:17.315498 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:02:17 crc kubenswrapper[4936]: I0930 15:02:17.691524 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cnrw6_05c30103-ea3f-41b7-82d1-73f43681e4e4/ssh-known-hosts-edpm-deployment/0.log" Sep 30 15:02:17 crc kubenswrapper[4936]: I0930 15:02:17.732403 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_87e335f7-bd98-45d0-a733-b2fc2dd3076e/tempest-tests-tempest-tests-runner/0.log" Sep 30 15:02:18 crc kubenswrapper[4936]: I0930 15:02:18.004652 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7f8c038d-9623-40b0-b7d1-5a0f66caf6bd/test-operator-logs-container/0.log" Sep 30 15:02:18 crc kubenswrapper[4936]: I0930 15:02:18.299120 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zskgc_d5817c4b-0566-4854-84c9-ad9a69b78172/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:02:25 crc kubenswrapper[4936]: I0930 15:02:25.697483 4936 generic.go:334] "Generic (PLEG): container finished" podID="b3e56aaf-ce14-4fec-b042-d021787a4db3" containerID="d6bae0cdd33ab14137a3613b4d01fadedcd9169d7d00646dbb9f918cf2141a42" exitCode=0 Sep 30 15:02:25 crc kubenswrapper[4936]: I0930 15:02:25.697794 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-fdwn9" event={"ID":"b3e56aaf-ce14-4fec-b042-d021787a4db3","Type":"ContainerDied","Data":"d6bae0cdd33ab14137a3613b4d01fadedcd9169d7d00646dbb9f918cf2141a42"} Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.832017 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.878903 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-764s8/crc-debug-fdwn9"] Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.887519 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-764s8/crc-debug-fdwn9"] Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.968436 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsjmj\" (UniqueName: \"kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj\") pod \"b3e56aaf-ce14-4fec-b042-d021787a4db3\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.968778 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host\") pod \"b3e56aaf-ce14-4fec-b042-d021787a4db3\" (UID: \"b3e56aaf-ce14-4fec-b042-d021787a4db3\") " Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.974912 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj" (OuterVolumeSpecName: "kube-api-access-gsjmj") pod "b3e56aaf-ce14-4fec-b042-d021787a4db3" (UID: "b3e56aaf-ce14-4fec-b042-d021787a4db3"). InnerVolumeSpecName "kube-api-access-gsjmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:26 crc kubenswrapper[4936]: I0930 15:02:26.975141 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host" (OuterVolumeSpecName: "host") pod "b3e56aaf-ce14-4fec-b042-d021787a4db3" (UID: "b3e56aaf-ce14-4fec-b042-d021787a4db3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:02:27 crc kubenswrapper[4936]: I0930 15:02:27.070798 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsjmj\" (UniqueName: \"kubernetes.io/projected/b3e56aaf-ce14-4fec-b042-d021787a4db3-kube-api-access-gsjmj\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:27 crc kubenswrapper[4936]: I0930 15:02:27.070833 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3e56aaf-ce14-4fec-b042-d021787a4db3-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:27 crc kubenswrapper[4936]: I0930 15:02:27.718301 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f59c171fc98a27a3f46c102364f098eae62aeb9745b1fe8ac6b414addfe33b" Sep 30 15:02:27 crc kubenswrapper[4936]: I0930 15:02:27.718383 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-fdwn9" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.270468 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-764s8/crc-debug-rvb8l"] Sep 30 15:02:28 crc kubenswrapper[4936]: E0930 15:02:28.271045 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e56aaf-ce14-4fec-b042-d021787a4db3" containerName="container-00" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.271057 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e56aaf-ce14-4fec-b042-d021787a4db3" containerName="container-00" Sep 30 15:02:28 crc kubenswrapper[4936]: E0930 15:02:28.271087 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0e7875-c029-4f42-928e-34e1d276bb70" containerName="keystone-cron" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.271094 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0e7875-c029-4f42-928e-34e1d276bb70" containerName="keystone-cron" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 
15:02:28.271356 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e56aaf-ce14-4fec-b042-d021787a4db3" containerName="container-00" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.271375 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0e7875-c029-4f42-928e-34e1d276bb70" containerName="keystone-cron" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.272017 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.276172 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-764s8"/"default-dockercfg-rf594" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.318838 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:02:28 crc kubenswrapper[4936]: E0930 15:02:28.320493 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.347996 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e56aaf-ce14-4fec-b042-d021787a4db3" path="/var/lib/kubelet/pods/b3e56aaf-ce14-4fec-b042-d021787a4db3/volumes" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.403463 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc94\" (UniqueName: \"kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94\") pod \"crc-debug-rvb8l\" (UID: 
\"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.403673 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host\") pod \"crc-debug-rvb8l\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.506400 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host\") pod \"crc-debug-rvb8l\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.506519 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc94\" (UniqueName: \"kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94\") pod \"crc-debug-rvb8l\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.506863 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host\") pod \"crc-debug-rvb8l\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.533649 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc94\" (UniqueName: \"kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94\") pod \"crc-debug-rvb8l\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc 
kubenswrapper[4936]: I0930 15:02:28.604134 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:28 crc kubenswrapper[4936]: I0930 15:02:28.733806 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-rvb8l" event={"ID":"9aed6b9d-4983-4d22-a394-75576eddfcd6","Type":"ContainerStarted","Data":"e5e57ad5a234540e91d1f1e6fefe4e7ecf77bce55185b4aaaf567a7f599b6a09"} Sep 30 15:02:29 crc kubenswrapper[4936]: I0930 15:02:29.751135 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-rvb8l" event={"ID":"9aed6b9d-4983-4d22-a394-75576eddfcd6","Type":"ContainerStarted","Data":"5d23944b200b5de33b637225e59ee8a3f659d86a644a1256c76317f968eee672"} Sep 30 15:02:29 crc kubenswrapper[4936]: I0930 15:02:29.768214 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-764s8/crc-debug-rvb8l" podStartSLOduration=1.768197963 podStartE2EDuration="1.768197963s" podCreationTimestamp="2025-09-30 15:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:29.766597209 +0000 UTC m=+5000.150599510" watchObservedRunningTime="2025-09-30 15:02:29.768197963 +0000 UTC m=+5000.152200264" Sep 30 15:02:30 crc kubenswrapper[4936]: I0930 15:02:30.777452 4936 generic.go:334] "Generic (PLEG): container finished" podID="9aed6b9d-4983-4d22-a394-75576eddfcd6" containerID="5d23944b200b5de33b637225e59ee8a3f659d86a644a1256c76317f968eee672" exitCode=0 Sep 30 15:02:30 crc kubenswrapper[4936]: I0930 15:02:30.777527 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-rvb8l" event={"ID":"9aed6b9d-4983-4d22-a394-75576eddfcd6","Type":"ContainerDied","Data":"5d23944b200b5de33b637225e59ee8a3f659d86a644a1256c76317f968eee672"} Sep 30 15:02:32 crc kubenswrapper[4936]: 
I0930 15:02:32.078552 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.203642 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xc94\" (UniqueName: \"kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94\") pod \"9aed6b9d-4983-4d22-a394-75576eddfcd6\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.205495 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host\") pod \"9aed6b9d-4983-4d22-a394-75576eddfcd6\" (UID: \"9aed6b9d-4983-4d22-a394-75576eddfcd6\") " Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.206236 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host" (OuterVolumeSpecName: "host") pod "9aed6b9d-4983-4d22-a394-75576eddfcd6" (UID: "9aed6b9d-4983-4d22-a394-75576eddfcd6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.209389 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94" (OuterVolumeSpecName: "kube-api-access-6xc94") pod "9aed6b9d-4983-4d22-a394-75576eddfcd6" (UID: "9aed6b9d-4983-4d22-a394-75576eddfcd6"). InnerVolumeSpecName "kube-api-access-6xc94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.309798 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aed6b9d-4983-4d22-a394-75576eddfcd6-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.310772 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xc94\" (UniqueName: \"kubernetes.io/projected/9aed6b9d-4983-4d22-a394-75576eddfcd6-kube-api-access-6xc94\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.801219 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-rvb8l" event={"ID":"9aed6b9d-4983-4d22-a394-75576eddfcd6","Type":"ContainerDied","Data":"e5e57ad5a234540e91d1f1e6fefe4e7ecf77bce55185b4aaaf567a7f599b6a09"} Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.801581 4936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e57ad5a234540e91d1f1e6fefe4e7ecf77bce55185b4aaaf567a7f599b6a09" Sep 30 15:02:32 crc kubenswrapper[4936]: I0930 15:02:32.801402 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-rvb8l" Sep 30 15:02:35 crc kubenswrapper[4936]: I0930 15:02:35.296456 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e3631d52-a6a9-46fc-b109-a8e0b96bac93/memcached/0.log" Sep 30 15:02:38 crc kubenswrapper[4936]: I0930 15:02:38.763869 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-764s8/crc-debug-rvb8l"] Sep 30 15:02:38 crc kubenswrapper[4936]: I0930 15:02:38.772006 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-764s8/crc-debug-rvb8l"] Sep 30 15:02:39 crc kubenswrapper[4936]: I0930 15:02:39.316249 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:02:39 crc kubenswrapper[4936]: E0930 15:02:39.316587 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:02:39 crc kubenswrapper[4936]: I0930 15:02:39.996246 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-764s8/crc-debug-8mgg8"] Sep 30 15:02:39 crc kubenswrapper[4936]: E0930 15:02:39.996911 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aed6b9d-4983-4d22-a394-75576eddfcd6" containerName="container-00" Sep 30 15:02:39 crc kubenswrapper[4936]: I0930 15:02:39.996923 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aed6b9d-4983-4d22-a394-75576eddfcd6" containerName="container-00" Sep 30 15:02:39 crc kubenswrapper[4936]: I0930 15:02:39.997092 4936 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9aed6b9d-4983-4d22-a394-75576eddfcd6" containerName="container-00" Sep 30 15:02:39 crc kubenswrapper[4936]: I0930 15:02:39.997706 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.008299 4936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-764s8"/"default-dockercfg-rf594" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.086900 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889hf\" (UniqueName: \"kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.087052 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.189785 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.189888 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889hf\" (UniqueName: \"kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" 
Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.189973 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.211967 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889hf\" (UniqueName: \"kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf\") pod \"crc-debug-8mgg8\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.321855 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.328553 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aed6b9d-4983-4d22-a394-75576eddfcd6" path="/var/lib/kubelet/pods/9aed6b9d-4983-4d22-a394-75576eddfcd6/volumes" Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.872765 4936 generic.go:334] "Generic (PLEG): container finished" podID="94648554-a89d-4421-af2d-0757174159b5" containerID="e7d4d8a24321d781265bfae626159785d462e9d315db6f17a693d1d58d98b662" exitCode=0 Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.872972 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-8mgg8" event={"ID":"94648554-a89d-4421-af2d-0757174159b5","Type":"ContainerDied","Data":"e7d4d8a24321d781265bfae626159785d462e9d315db6f17a693d1d58d98b662"} Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.873092 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/crc-debug-8mgg8" 
event={"ID":"94648554-a89d-4421-af2d-0757174159b5","Type":"ContainerStarted","Data":"4e3387daefdd96f829a0919c9c51dd5b91534349a22961213829dcdfc4fb7701"} Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.917849 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-764s8/crc-debug-8mgg8"] Sep 30 15:02:40 crc kubenswrapper[4936]: I0930 15:02:40.926795 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-764s8/crc-debug-8mgg8"] Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.070011 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.128529 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host\") pod \"94648554-a89d-4421-af2d-0757174159b5\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.128652 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889hf\" (UniqueName: \"kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf\") pod \"94648554-a89d-4421-af2d-0757174159b5\" (UID: \"94648554-a89d-4421-af2d-0757174159b5\") " Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.128659 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host" (OuterVolumeSpecName: "host") pod "94648554-a89d-4421-af2d-0757174159b5" (UID: "94648554-a89d-4421-af2d-0757174159b5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.129146 4936 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94648554-a89d-4421-af2d-0757174159b5-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.143084 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf" (OuterVolumeSpecName: "kube-api-access-889hf") pod "94648554-a89d-4421-af2d-0757174159b5" (UID: "94648554-a89d-4421-af2d-0757174159b5"). InnerVolumeSpecName "kube-api-access-889hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.231658 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889hf\" (UniqueName: \"kubernetes.io/projected/94648554-a89d-4421-af2d-0757174159b5-kube-api-access-889hf\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.327472 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94648554-a89d-4421-af2d-0757174159b5" path="/var/lib/kubelet/pods/94648554-a89d-4421-af2d-0757174159b5/volumes" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.902033 4936 scope.go:117] "RemoveContainer" containerID="e7d4d8a24321d781265bfae626159785d462e9d315db6f17a693d1d58d98b662" Sep 30 15:02:42 crc kubenswrapper[4936]: I0930 15:02:42.902171 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/crc-debug-8mgg8" Sep 30 15:02:50 crc kubenswrapper[4936]: I0930 15:02:50.327352 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:02:50 crc kubenswrapper[4936]: E0930 15:02:50.328162 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:02:55 crc kubenswrapper[4936]: I0930 15:02:55.754614 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log" Sep 30 15:02:55 crc kubenswrapper[4936]: I0930 15:02:55.918494 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log" Sep 30 15:02:55 crc kubenswrapper[4936]: I0930 15:02:55.927812 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log" Sep 30 15:02:55 crc kubenswrapper[4936]: I0930 15:02:55.977700 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.174814 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/pull/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.193607 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/util/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.234451 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6c4c94d9f6bd782fa3ccfa979db0beef516291132371833a20bf3eae1drc9l2_286303d8-20ec-45c4-86cf-3da1af48f329/extract/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.431827 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-x5g5r_650ff8e9-279f-41ff-8bb8-1880e7cf985c/kube-rbac-proxy/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.505265 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-x5g5r_650ff8e9-279f-41ff-8bb8-1880e7cf985c/manager/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.623179 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-6c8cz_7cc732ee-78f8-4d20-aac8-67ab10b944d3/kube-rbac-proxy/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.771498 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-6c8cz_7cc732ee-78f8-4d20-aac8-67ab10b944d3/manager/0.log" Sep 30 15:02:56 crc kubenswrapper[4936]: I0930 15:02:56.899988 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xz9zg_9d8425ad-dcdc-4d31-9a5c-9461adb3296c/kube-rbac-proxy/0.log" Sep 30 15:02:56 crc 
kubenswrapper[4936]: I0930 15:02:56.918351 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-xz9zg_9d8425ad-dcdc-4d31-9a5c-9461adb3296c/manager/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.096915 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fk2zk_bd9be0ef-9048-4e0a-b8d7-1b29b450984f/kube-rbac-proxy/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.214104 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fk2zk_bd9be0ef-9048-4e0a-b8d7-1b29b450984f/manager/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.285675 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-g5k7h_f632d83e-2c2c-4c90-8fea-5747d58633d6/kube-rbac-proxy/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.381497 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-g5k7h_f632d83e-2c2c-4c90-8fea-5747d58633d6/manager/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.471699 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9jjnj_13bd8563-ccb1-4445-b613-495e801195a4/kube-rbac-proxy/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.604282 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9jjnj_13bd8563-ccb1-4445-b613-495e801195a4/manager/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.749851 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-hwpcz_7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6/kube-rbac-proxy/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.835565 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-hwpcz_7f5f8f3d-7d7a-4cc6-acbf-f47049f0f5b6/manager/0.log" Sep 30 15:02:57 crc kubenswrapper[4936]: I0930 15:02:57.951837 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-n28ht_189d95a0-9ee0-4055-86c2-724082d46a11/kube-rbac-proxy/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.103616 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-n28ht_189d95a0-9ee0-4055-86c2-724082d46a11/manager/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.162169 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-hgdlx_c54f70b2-5767-4616-878b-5816861d2637/kube-rbac-proxy/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.259306 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-hgdlx_c54f70b2-5767-4616-878b-5816861d2637/manager/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.427905 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-sxftv_d08bae3c-64f1-46de-ab2c-d6b2407c2d95/manager/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.429948 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-sxftv_d08bae3c-64f1-46de-ab2c-d6b2407c2d95/kube-rbac-proxy/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.671807 
4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8tpqq_847b4871-2d23-4790-b32a-b42698008fee/kube-rbac-proxy/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.678656 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8tpqq_847b4871-2d23-4790-b32a-b42698008fee/manager/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.859121 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kldb_e546e1bb-9ee4-4549-9521-76d122b4edf5/kube-rbac-proxy/0.log" Sep 30 15:02:58 crc kubenswrapper[4936]: I0930 15:02:58.968585 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kldb_e546e1bb-9ee4-4549-9521-76d122b4edf5/manager/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.022787 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-zvzfj_5ec29297-f0db-497d-aa05-e939e9aef380/kube-rbac-proxy/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.190621 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-zvzfj_5ec29297-f0db-497d-aa05-e939e9aef380/manager/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.270085 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-lhbld_d50d2534-deec-4173-a73a-d10b3beac452/kube-rbac-proxy/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.316789 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-lhbld_d50d2534-deec-4173-a73a-d10b3beac452/manager/0.log" Sep 30 15:02:59 crc 
kubenswrapper[4936]: I0930 15:02:59.555683 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-5mdhl_b1538e13-4b0e-4bb9-9277-3d0475cd41a4/kube-rbac-proxy/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.669912 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-5mdhl_b1538e13-4b0e-4bb9-9277-3d0475cd41a4/manager/0.log" Sep 30 15:02:59 crc kubenswrapper[4936]: I0930 15:02:59.801529 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-64c94699b9-lqnhn_b4e378e0-0a69-47c9-b80f-fee159c4ad5b/kube-rbac-proxy/0.log" Sep 30 15:03:00 crc kubenswrapper[4936]: I0930 15:03:00.019258 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69769bbb6-9mvrz_4ea53f04-2776-4e45-9444-6255d7fd2860/kube-rbac-proxy/0.log" Sep 30 15:03:00 crc kubenswrapper[4936]: I0930 15:03:00.264132 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69769bbb6-9mvrz_4ea53f04-2776-4e45-9444-6255d7fd2860/operator/0.log" Sep 30 15:03:00 crc kubenswrapper[4936]: I0930 15:03:00.289745 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b5lzr_6fc87ca3-ce0e-4976-b45f-cf28709a6f9f/registry-server/0.log" Sep 30 15:03:00 crc kubenswrapper[4936]: I0930 15:03:00.947853 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-sthwj_85448d25-86e9-4a2f-bc5a-339ab3d2112a/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.009140 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-sxfzz_699f5243-7ea5-4f7f-a537-51a99a871ccb/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.032612 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-sthwj_85448d25-86e9-4a2f-bc5a-339ab3d2112a/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.188404 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-sxfzz_699f5243-7ea5-4f7f-a537-51a99a871ccb/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.256398 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-64c94699b9-lqnhn_b4e378e0-0a69-47c9-b80f-fee159c4ad5b/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.342922 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-49bq2_d93cfe4f-caf4-4b23-9d9c-0aa14cb5bc28/operator/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.434447 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-n6mv9_bbbf4ed1-241b-4c4e-80e0-77acd778b868/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.459500 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-n6mv9_bbbf4ed1-241b-4c4e-80e0-77acd778b868/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.528138 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-7rfmc_78f55939-d8fc-40d5-bc8e-a3f87b962b34/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.666406 4936 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-7rfmc_78f55939-d8fc-40d5-bc8e-a3f87b962b34/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.697292 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-x2csq_673847ae-740d-4a3b-ad7e-09ec8848199d/manager/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.780011 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-x2csq_673847ae-740d-4a3b-ad7e-09ec8848199d/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.878456 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-d2tm5_639a60da-010a-40d5-bfec-6219ef3f712b/kube-rbac-proxy/0.log" Sep 30 15:03:01 crc kubenswrapper[4936]: I0930 15:03:01.908377 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-d2tm5_639a60da-010a-40d5-bfec-6219ef3f712b/manager/0.log" Sep 30 15:03:04 crc kubenswrapper[4936]: I0930 15:03:04.315760 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:03:04 crc kubenswrapper[4936]: E0930 15:03:04.316415 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:03:18 crc kubenswrapper[4936]: I0930 15:03:18.706701 4936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6mxph_6d8ca06e-ea42-4678-8d28-dcd11b4dd1ce/control-plane-machine-set-operator/0.log" Sep 30 15:03:19 crc kubenswrapper[4936]: I0930 15:03:19.209554 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rw728_27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30/kube-rbac-proxy/0.log" Sep 30 15:03:19 crc kubenswrapper[4936]: I0930 15:03:19.236010 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rw728_27e3e6c9-3a6f-4f01-8fd6-8801b73f2b30/machine-api-operator/0.log" Sep 30 15:03:19 crc kubenswrapper[4936]: I0930 15:03:19.316413 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:03:19 crc kubenswrapper[4936]: E0930 15:03:19.320511 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:03:31 crc kubenswrapper[4936]: I0930 15:03:31.315146 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:03:31 crc kubenswrapper[4936]: E0930 15:03:31.316036 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" 
podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:03:31 crc kubenswrapper[4936]: I0930 15:03:31.913596 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wm65j_392d0573-afef-4492-9768-2d9c4830d7b8/cert-manager-controller/0.log" Sep 30 15:03:32 crc kubenswrapper[4936]: I0930 15:03:32.072952 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xwsfb_b59b1137-6114-4d73-8593-250d0da0b741/cert-manager-cainjector/0.log" Sep 30 15:03:32 crc kubenswrapper[4936]: I0930 15:03:32.127087 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-z76zc_a9648b53-2a15-447e-bca5-87692ab32278/cert-manager-webhook/0.log" Sep 30 15:03:44 crc kubenswrapper[4936]: I0930 15:03:44.343067 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-ngb5v_f690b866-399e-4bf7-bed4-c261098bfbb1/nmstate-console-plugin/0.log" Sep 30 15:03:44 crc kubenswrapper[4936]: I0930 15:03:44.585108 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lgvdl_6b791ba1-37e6-440f-899d-7db4972b74f5/nmstate-handler/0.log" Sep 30 15:03:44 crc kubenswrapper[4936]: I0930 15:03:44.596553 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-gj98k_e80b68f2-d68e-4499-84a4-8a83b18922c6/kube-rbac-proxy/0.log" Sep 30 15:03:44 crc kubenswrapper[4936]: I0930 15:03:44.663925 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-gj98k_e80b68f2-d68e-4499-84a4-8a83b18922c6/nmstate-metrics/0.log" Sep 30 15:03:44 crc kubenswrapper[4936]: I0930 15:03:44.840935 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-cln2v_4409f314-97f6-4776-980f-c7727fa7fd18/nmstate-operator/0.log" Sep 30 15:03:44 crc 
kubenswrapper[4936]: I0930 15:03:44.859201 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-284mm_50cdc3bf-7d9e-4644-87b2-81e93c15174a/nmstate-webhook/0.log" Sep 30 15:03:46 crc kubenswrapper[4936]: I0930 15:03:46.315187 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9" Sep 30 15:03:46 crc kubenswrapper[4936]: E0930 15:03:46.317019 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" Sep 30 15:03:59 crc kubenswrapper[4936]: I0930 15:03:59.745155 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pkkdt_63b433c1-ca17-4e41-9412-8c9abede7b39/kube-rbac-proxy/0.log" Sep 30 15:03:59 crc kubenswrapper[4936]: I0930 15:03:59.880729 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-pkkdt_63b433c1-ca17-4e41-9412-8c9abede7b39/controller/0.log" Sep 30 15:03:59 crc kubenswrapper[4936]: I0930 15:03:59.929752 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log" Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.153874 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log" Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.183551 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log" Sep 30 15:04:00 
crc kubenswrapper[4936]: I0930 15:04:00.188913 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.198292 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.470137 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.476185 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.480020 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.483873 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.690768 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-metrics/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.692358 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-reloader/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.709695 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/cp-frr-files/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.714525 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/controller/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.879886 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/frr-metrics/0.log"
Sep 30 15:04:00 crc kubenswrapper[4936]: I0930 15:04:00.959183 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/kube-rbac-proxy-frr/0.log"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.026290 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/kube-rbac-proxy/0.log"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.192625 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/reloader/0.log"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.315633 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:04:01 crc kubenswrapper[4936]: E0930 15:04:01.315958 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.334478 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-chnlb_f304af2d-f6f0-4be9-8388-a81870af995f/frr-k8s-webhook-server/0.log"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.650209 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd76b6558-4gwpj_d4775401-fba2-4958-b075-6862db490e18/manager/0.log"
Sep 30 15:04:01 crc kubenswrapper[4936]: I0930 15:04:01.734371 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c67dfbd86-j4s2n_ab06cf8d-01c1-45c8-9c95-6f3369b8ef75/webhook-server/0.log"
Sep 30 15:04:02 crc kubenswrapper[4936]: I0930 15:04:02.165637 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-24z6s_f673a383-44a6-4fe9-a432-f84341817e89/kube-rbac-proxy/0.log"
Sep 30 15:04:02 crc kubenswrapper[4936]: I0930 15:04:02.461398 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j947b_55e7d0de-bf31-4644-958e-33d7fe7c696b/frr/0.log"
Sep 30 15:04:02 crc kubenswrapper[4936]: I0930 15:04:02.646850 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-24z6s_f673a383-44a6-4fe9-a432-f84341817e89/speaker/0.log"
Sep 30 15:04:13 crc kubenswrapper[4936]: I0930 15:04:13.315242 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:04:13 crc kubenswrapper[4936]: E0930 15:04:13.316172 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.325892 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.560840 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.641317 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.641368 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.824215 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/util/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.847362 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/pull/0.log"
Sep 30 15:04:15 crc kubenswrapper[4936]: I0930 15:04:15.895485 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bctsq76_9ece5e6c-214d-460e-bedf-19196f994946/extract/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.062771 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.253369 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.293309 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.321895 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.469785 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-content/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.517025 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/extract-utilities/0.log"
Sep 30 15:04:16 crc kubenswrapper[4936]: I0930 15:04:16.898585 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.076954 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.082121 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f6hgz_2ca06e96-23c1-4b90-a643-3e36b8df9443/registry-server/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.108289 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.184225 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.347501 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-content/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.387214 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/extract-utilities/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.626911 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.940988 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log"
Sep 30 15:04:17 crc kubenswrapper[4936]: I0930 15:04:17.941873 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.028128 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.269309 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kljrf_5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/registry-server/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.313953 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/extract/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.321117 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/util/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.357487 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96wj9bb_2b7261a1-f326-4692-ac33-cef53002b4eb/pull/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.528226 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q587x_18fdb3dd-ed9e-4625-9bb8-7f2a079396dd/marketplace-operator/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.541691 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.763060 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.831197 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log"
Sep 30 15:04:18 crc kubenswrapper[4936]: I0930 15:04:18.856260 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.030865 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-utilities/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.048586 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/extract-content/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.282859 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f4nkx_e48746c3-7005-4672-a536-f6b419f168fc/registry-server/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.353759 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.484905 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.521674 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.575958 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.718312 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-utilities/0.log"
Sep 30 15:04:19 crc kubenswrapper[4936]: I0930 15:04:19.723515 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/extract-content/0.log"
Sep 30 15:04:20 crc kubenswrapper[4936]: I0930 15:04:20.134117 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k8t8h_d779fd94-9fb2-4bd0-95c3-d7ac8885b589/registry-server/0.log"
Sep 30 15:04:25 crc kubenswrapper[4936]: I0930 15:04:25.316482 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:04:25 crc kubenswrapper[4936]: E0930 15:04:25.317358 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:04:40 crc kubenswrapper[4936]: I0930 15:04:40.324198 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:04:40 crc kubenswrapper[4936]: E0930 15:04:40.326214 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:04:54 crc kubenswrapper[4936]: I0930 15:04:54.315465 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:04:54 crc kubenswrapper[4936]: E0930 15:04:54.317483 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:05:06 crc kubenswrapper[4936]: I0930 15:05:06.316429 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:05:06 crc kubenswrapper[4936]: E0930 15:05:06.317177 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:05:19 crc kubenswrapper[4936]: I0930 15:05:19.315217 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:05:19 crc kubenswrapper[4936]: E0930 15:05:19.316822 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:05:30 crc kubenswrapper[4936]: I0930 15:05:30.322729 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:05:30 crc kubenswrapper[4936]: E0930 15:05:30.323580 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.872600 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b8kr8"]
Sep 30 15:05:40 crc kubenswrapper[4936]: E0930 15:05:40.873873 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94648554-a89d-4421-af2d-0757174159b5" containerName="container-00"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.873895 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="94648554-a89d-4421-af2d-0757174159b5" containerName="container-00"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.874157 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="94648554-a89d-4421-af2d-0757174159b5" containerName="container-00"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.876049 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.891433 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8kr8"]
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.941702 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-catalog-content\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.941899 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-utilities\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:40 crc kubenswrapper[4936]: I0930 15:05:40.941979 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsqks\" (UniqueName: \"kubernetes.io/projected/ade3de07-72c3-40d5-982a-1e75c734d739-kube-api-access-xsqks\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.044298 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-utilities\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.044391 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsqks\" (UniqueName: \"kubernetes.io/projected/ade3de07-72c3-40d5-982a-1e75c734d739-kube-api-access-xsqks\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.044470 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-catalog-content\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.044917 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-catalog-content\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.045122 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ade3de07-72c3-40d5-982a-1e75c734d739-utilities\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.083854 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsqks\" (UniqueName: \"kubernetes.io/projected/ade3de07-72c3-40d5-982a-1e75c734d739-kube-api-access-xsqks\") pod \"community-operators-b8kr8\" (UID: \"ade3de07-72c3-40d5-982a-1e75c734d739\") " pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.201486 4936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:41 crc kubenswrapper[4936]: I0930 15:05:41.661313 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8kr8"]
Sep 30 15:05:42 crc kubenswrapper[4936]: I0930 15:05:42.590776 4936 generic.go:334] "Generic (PLEG): container finished" podID="ade3de07-72c3-40d5-982a-1e75c734d739" containerID="4a06594b45b479ae370390a6b137dbade548d6d2510cfa862005ae8568fde9b9" exitCode=0
Sep 30 15:05:42 crc kubenswrapper[4936]: I0930 15:05:42.591361 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8kr8" event={"ID":"ade3de07-72c3-40d5-982a-1e75c734d739","Type":"ContainerDied","Data":"4a06594b45b479ae370390a6b137dbade548d6d2510cfa862005ae8568fde9b9"}
Sep 30 15:05:42 crc kubenswrapper[4936]: I0930 15:05:42.591395 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8kr8" event={"ID":"ade3de07-72c3-40d5-982a-1e75c734d739","Type":"ContainerStarted","Data":"f79078c53961c59f4befeea475caffacb8c659fdb260e952d58bb703ab65e79a"}
Sep 30 15:05:42 crc kubenswrapper[4936]: I0930 15:05:42.594194 4936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 15:05:43 crc kubenswrapper[4936]: I0930 15:05:43.316112 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:05:43 crc kubenswrapper[4936]: E0930 15:05:43.316309 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:05:48 crc kubenswrapper[4936]: I0930 15:05:48.647677 4936 generic.go:334] "Generic (PLEG): container finished" podID="ade3de07-72c3-40d5-982a-1e75c734d739" containerID="0868994ad4bf24959952dba1a4a00dd632366073c463dd79518fe62f133f21a3" exitCode=0
Sep 30 15:05:48 crc kubenswrapper[4936]: I0930 15:05:48.647805 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8kr8" event={"ID":"ade3de07-72c3-40d5-982a-1e75c734d739","Type":"ContainerDied","Data":"0868994ad4bf24959952dba1a4a00dd632366073c463dd79518fe62f133f21a3"}
Sep 30 15:05:49 crc kubenswrapper[4936]: I0930 15:05:49.658659 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8kr8" event={"ID":"ade3de07-72c3-40d5-982a-1e75c734d739","Type":"ContainerStarted","Data":"c1cc133f5dfbb785f5ee9cefcace67d4ce160fd7cd36fda694b58711d6500221"}
Sep 30 15:05:49 crc kubenswrapper[4936]: I0930 15:05:49.685015 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b8kr8" podStartSLOduration=3.135692198 podStartE2EDuration="9.684996196s" podCreationTimestamp="2025-09-30 15:05:40 +0000 UTC" firstStartedPulling="2025-09-30 15:05:42.593934991 +0000 UTC m=+5192.977937292" lastFinishedPulling="2025-09-30 15:05:49.143238989 +0000 UTC m=+5199.527241290" observedRunningTime="2025-09-30 15:05:49.676574115 +0000 UTC m=+5200.060576436" watchObservedRunningTime="2025-09-30 15:05:49.684996196 +0000 UTC m=+5200.068998497"
Sep 30 15:05:51 crc kubenswrapper[4936]: I0930 15:05:51.202392 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:51 crc kubenswrapper[4936]: I0930 15:05:51.202613 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:05:52 crc kubenswrapper[4936]: I0930 15:05:52.250169 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b8kr8" podUID="ade3de07-72c3-40d5-982a-1e75c734d739" containerName="registry-server" probeResult="failure" output=<
Sep 30 15:05:52 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s
Sep 30 15:05:52 crc kubenswrapper[4936]: >
Sep 30 15:05:56 crc kubenswrapper[4936]: I0930 15:05:56.315605 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:05:56 crc kubenswrapper[4936]: E0930 15:05:56.317760 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.249560 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.302131 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b8kr8"
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.437742 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8kr8"]
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.501137 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kljrf"]
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.501859 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kljrf" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="registry-server" containerID="cri-o://08aaed2abf42edbc93f1633b3174930379ed5c15d7827c5159ecf31a45f1dfd2" gracePeriod=2
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.779727 4936 generic.go:334] "Generic (PLEG): container finished" podID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerID="08aaed2abf42edbc93f1633b3174930379ed5c15d7827c5159ecf31a45f1dfd2" exitCode=0
Sep 30 15:06:01 crc kubenswrapper[4936]: I0930 15:06:01.780581 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerDied","Data":"08aaed2abf42edbc93f1633b3174930379ed5c15d7827c5159ecf31a45f1dfd2"}
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.036715 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.092856 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content\") pod \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") "
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.092923 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tw4\" (UniqueName: \"kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4\") pod \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") "
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.093104 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities\") pod \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\" (UID: \"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0\") "
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.094412 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities" (OuterVolumeSpecName: "utilities") pod "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" (UID: "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.105727 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4" (OuterVolumeSpecName: "kube-api-access-p7tw4") pod "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" (UID: "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0"). InnerVolumeSpecName "kube-api-access-p7tw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.180926 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" (UID: "5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.195749 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.195780 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.195794 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tw4\" (UniqueName: \"kubernetes.io/projected/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0-kube-api-access-p7tw4\") on node \"crc\" DevicePath \"\""
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.794646 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kljrf"
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.795071 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kljrf" event={"ID":"5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0","Type":"ContainerDied","Data":"8665774d385ff82dd7ff7e92fe50e34216907b89f3ef330212f2911605681621"}
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.795100 4936 scope.go:117] "RemoveContainer" containerID="08aaed2abf42edbc93f1633b3174930379ed5c15d7827c5159ecf31a45f1dfd2"
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.836390 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kljrf"]
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.852689 4936 scope.go:117] "RemoveContainer" containerID="f4ab1057acf7e8f7f3dd93fff43106dcad31aab29148ee599f4d8bad67184170"
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.855113 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kljrf"]
Sep 30 15:06:02 crc kubenswrapper[4936]: I0930 15:06:02.883307 4936 scope.go:117] "RemoveContainer" containerID="90d9e4f63d26d1dc1d912a144aa3a2531c9f4917d7e9d55c5898a84dc85e35c5"
Sep 30 15:06:04 crc kubenswrapper[4936]: I0930 15:06:04.325843 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" path="/var/lib/kubelet/pods/5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0/volumes"
Sep 30 15:06:07 crc kubenswrapper[4936]: I0930 15:06:07.315413 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:06:07 crc kubenswrapper[4936]: E0930 15:06:07.317054 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:06:21 crc kubenswrapper[4936]: I0930 15:06:21.315305 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:06:21 crc kubenswrapper[4936]: E0930 15:06:21.316025 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:06:36 crc kubenswrapper[4936]: I0930 15:06:36.318137 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:06:36 crc kubenswrapper[4936]: E0930 15:06:36.318905 4936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wj4sz_openshift-machine-config-operator(e09d215c-5c94-4b2a-bc68-c51a84b784a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7"
Sep 30 15:06:42 crc kubenswrapper[4936]: I0930 15:06:42.785071 4936 scope.go:117] "RemoveContainer" containerID="d6bae0cdd33ab14137a3613b4d01fadedcd9169d7d00646dbb9f918cf2141a42"
Sep 30 15:06:50 crc kubenswrapper[4936]: I0930 15:06:50.322112 4936 scope.go:117] "RemoveContainer" containerID="d04c5c09629b1bd43e4e6b91e74b3e35ef6378d44e91dceece0b14c861e53cd9"
Sep 30 15:06:51 crc kubenswrapper[4936]: I0930 15:06:51.190838 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" event={"ID":"e09d215c-5c94-4b2a-bc68-c51a84b784a7","Type":"ContainerStarted","Data":"91abd2e0fe2e7d995c8ca08d8685a03be71f88fe7e68c55de3eb8b4b9396979f"}
Sep 30 15:07:08 crc kubenswrapper[4936]: I0930 15:07:08.340857 4936 generic.go:334] "Generic (PLEG): container finished" podID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerID="362de18b3f46dcad8d48dc9efdf947737b0ea239d86a9f8ae8826327381b2c00" exitCode=0
Sep 30 15:07:08 crc kubenswrapper[4936]: I0930 15:07:08.341394 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-764s8/must-gather-5wk72" event={"ID":"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5","Type":"ContainerDied","Data":"362de18b3f46dcad8d48dc9efdf947737b0ea239d86a9f8ae8826327381b2c00"}
Sep 30 15:07:08 crc kubenswrapper[4936]: I0930 15:07:08.342650 4936 scope.go:117] "RemoveContainer"
containerID="362de18b3f46dcad8d48dc9efdf947737b0ea239d86a9f8ae8826327381b2c00" Sep 30 15:07:08 crc kubenswrapper[4936]: I0930 15:07:08.490983 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-764s8_must-gather-5wk72_7f46ad41-3b51-4de2-a9f4-f1200d6d85c5/gather/0.log" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.147317 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-764s8/must-gather-5wk72"] Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.147949 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-764s8/must-gather-5wk72" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="copy" containerID="cri-o://56fa05d4cb8d390a3fcb555f2593137a0807428d32b9e655267d3440a423bf1b" gracePeriod=2 Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.165132 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-764s8/must-gather-5wk72"] Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.496497 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-764s8_must-gather-5wk72_7f46ad41-3b51-4de2-a9f4-f1200d6d85c5/copy/0.log" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.498002 4936 generic.go:334] "Generic (PLEG): container finished" podID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerID="56fa05d4cb8d390a3fcb555f2593137a0807428d32b9e655267d3440a423bf1b" exitCode=143 Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.612938 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-764s8_must-gather-5wk72_7f46ad41-3b51-4de2-a9f4-f1200d6d85c5/copy/0.log" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.613560 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.790449 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k78w\" (UniqueName: \"kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w\") pod \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.790668 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output\") pod \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\" (UID: \"7f46ad41-3b51-4de2-a9f4-f1200d6d85c5\") " Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.810093 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w" (OuterVolumeSpecName: "kube-api-access-7k78w") pod "7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" (UID: "7f46ad41-3b51-4de2-a9f4-f1200d6d85c5"). InnerVolumeSpecName "kube-api-access-7k78w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.893259 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k78w\" (UniqueName: \"kubernetes.io/projected/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-kube-api-access-7k78w\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.981363 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" (UID: "7f46ad41-3b51-4de2-a9f4-f1200d6d85c5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:07:23 crc kubenswrapper[4936]: I0930 15:07:23.994317 4936 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 15:07:24 crc kubenswrapper[4936]: I0930 15:07:24.332024 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" path="/var/lib/kubelet/pods/7f46ad41-3b51-4de2-a9f4-f1200d6d85c5/volumes" Sep 30 15:07:24 crc kubenswrapper[4936]: I0930 15:07:24.509013 4936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-764s8_must-gather-5wk72_7f46ad41-3b51-4de2-a9f4-f1200d6d85c5/copy/0.log" Sep 30 15:07:24 crc kubenswrapper[4936]: I0930 15:07:24.509705 4936 scope.go:117] "RemoveContainer" containerID="56fa05d4cb8d390a3fcb555f2593137a0807428d32b9e655267d3440a423bf1b" Sep 30 15:07:24 crc kubenswrapper[4936]: I0930 15:07:24.509720 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-764s8/must-gather-5wk72" Sep 30 15:07:24 crc kubenswrapper[4936]: I0930 15:07:24.542722 4936 scope.go:117] "RemoveContainer" containerID="362de18b3f46dcad8d48dc9efdf947737b0ea239d86a9f8ae8826327381b2c00" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.173970 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:28 crc kubenswrapper[4936]: E0930 15:08:28.174829 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="extract-content" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.174843 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="extract-content" Sep 30 15:08:28 crc kubenswrapper[4936]: E0930 15:08:28.174858 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="extract-utilities" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.174865 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="extract-utilities" Sep 30 15:08:28 crc kubenswrapper[4936]: E0930 15:08:28.174879 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="copy" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.174885 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="copy" Sep 30 15:08:28 crc kubenswrapper[4936]: E0930 15:08:28.174895 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="registry-server" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.174901 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="registry-server" Sep 30 
15:08:28 crc kubenswrapper[4936]: E0930 15:08:28.174912 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="gather" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.174918 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="gather" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.175120 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4a0dbd-4ec7-4144-8c67-19b0716d0ed0" containerName="registry-server" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.175145 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="gather" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.175155 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f46ad41-3b51-4de2-a9f4-f1200d6d85c5" containerName="copy" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.182817 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.186509 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.236381 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.236557 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwz4\" (UniqueName: \"kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.236586 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.338807 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.339654 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.339611 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwz4\" (UniqueName: \"kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.340916 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.341399 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.360137 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwz4\" (UniqueName: \"kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4\") pod \"redhat-operators-zvrjp\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:28 crc kubenswrapper[4936]: I0930 15:08:28.536133 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:29 crc kubenswrapper[4936]: I0930 15:08:29.026785 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:29 crc kubenswrapper[4936]: I0930 15:08:29.084603 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerStarted","Data":"464fb77d91203e5e44447686fc80c4daaa3b6acb658d84b9bf9d6e3ee6637122"} Sep 30 15:08:30 crc kubenswrapper[4936]: I0930 15:08:30.097077 4936 generic.go:334] "Generic (PLEG): container finished" podID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerID="9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252" exitCode=0 Sep 30 15:08:30 crc kubenswrapper[4936]: I0930 15:08:30.097412 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerDied","Data":"9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252"} Sep 30 15:08:31 crc kubenswrapper[4936]: I0930 15:08:31.108918 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerStarted","Data":"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791"} Sep 30 15:08:35 crc kubenswrapper[4936]: I0930 15:08:35.147306 4936 generic.go:334] "Generic (PLEG): container finished" podID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerID="814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791" exitCode=0 Sep 30 15:08:35 crc kubenswrapper[4936]: I0930 15:08:35.147365 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" 
event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerDied","Data":"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791"} Sep 30 15:08:36 crc kubenswrapper[4936]: I0930 15:08:36.160562 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerStarted","Data":"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e"} Sep 30 15:08:36 crc kubenswrapper[4936]: I0930 15:08:36.182942 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvrjp" podStartSLOduration=2.559071371 podStartE2EDuration="8.182920742s" podCreationTimestamp="2025-09-30 15:08:28 +0000 UTC" firstStartedPulling="2025-09-30 15:08:30.099497222 +0000 UTC m=+5360.483499523" lastFinishedPulling="2025-09-30 15:08:35.723346593 +0000 UTC m=+5366.107348894" observedRunningTime="2025-09-30 15:08:36.176501445 +0000 UTC m=+5366.560503746" watchObservedRunningTime="2025-09-30 15:08:36.182920742 +0000 UTC m=+5366.566923043" Sep 30 15:08:38 crc kubenswrapper[4936]: I0930 15:08:38.536806 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:38 crc kubenswrapper[4936]: I0930 15:08:38.537161 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:39 crc kubenswrapper[4936]: I0930 15:08:39.592506 4936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvrjp" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="registry-server" probeResult="failure" output=< Sep 30 15:08:39 crc kubenswrapper[4936]: timeout: failed to connect service ":50051" within 1s Sep 30 15:08:39 crc kubenswrapper[4936]: > Sep 30 15:08:42 crc kubenswrapper[4936]: I0930 15:08:42.913301 4936 scope.go:117] "RemoveContainer" 
containerID="5d23944b200b5de33b637225e59ee8a3f659d86a644a1256c76317f968eee672" Sep 30 15:08:48 crc kubenswrapper[4936]: I0930 15:08:48.586903 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:48 crc kubenswrapper[4936]: I0930 15:08:48.646952 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:48 crc kubenswrapper[4936]: I0930 15:08:48.825593 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.264423 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvrjp" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="registry-server" containerID="cri-o://02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e" gracePeriod=2 Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.747195 4936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.846991 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content\") pod \"b74bc000-0baf-4a27-b535-c4b73ff777a8\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.847294 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwz4\" (UniqueName: \"kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4\") pod \"b74bc000-0baf-4a27-b535-c4b73ff777a8\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.847461 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities\") pod \"b74bc000-0baf-4a27-b535-c4b73ff777a8\" (UID: \"b74bc000-0baf-4a27-b535-c4b73ff777a8\") " Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.848747 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities" (OuterVolumeSpecName: "utilities") pod "b74bc000-0baf-4a27-b535-c4b73ff777a8" (UID: "b74bc000-0baf-4a27-b535-c4b73ff777a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.870375 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4" (OuterVolumeSpecName: "kube-api-access-zkwz4") pod "b74bc000-0baf-4a27-b535-c4b73ff777a8" (UID: "b74bc000-0baf-4a27-b535-c4b73ff777a8"). InnerVolumeSpecName "kube-api-access-zkwz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.920673 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b74bc000-0baf-4a27-b535-c4b73ff777a8" (UID: "b74bc000-0baf-4a27-b535-c4b73ff777a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.950610 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.950656 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwz4\" (UniqueName: \"kubernetes.io/projected/b74bc000-0baf-4a27-b535-c4b73ff777a8-kube-api-access-zkwz4\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:50 crc kubenswrapper[4936]: I0930 15:08:50.950669 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74bc000-0baf-4a27-b535-c4b73ff777a8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.279653 4936 generic.go:334] "Generic (PLEG): container finished" podID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerID="02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e" exitCode=0 Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.279697 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerDied","Data":"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e"} Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.279725 4936 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zvrjp" event={"ID":"b74bc000-0baf-4a27-b535-c4b73ff777a8","Type":"ContainerDied","Data":"464fb77d91203e5e44447686fc80c4daaa3b6acb658d84b9bf9d6e3ee6637122"} Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.279742 4936 scope.go:117] "RemoveContainer" containerID="02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.279869 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrjp" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.313583 4936 scope.go:117] "RemoveContainer" containerID="814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.322467 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.334384 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvrjp"] Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.339133 4936 scope.go:117] "RemoveContainer" containerID="9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.378007 4936 scope.go:117] "RemoveContainer" containerID="02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e" Sep 30 15:08:51 crc kubenswrapper[4936]: E0930 15:08:51.378999 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e\": container with ID starting with 02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e not found: ID does not exist" containerID="02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.379233 4936 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e"} err="failed to get container status \"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e\": rpc error: code = NotFound desc = could not find container \"02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e\": container with ID starting with 02e5076ff378c761d59afe356d9a61fd136c804a91a602fcf0071805ef66a51e not found: ID does not exist" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.379346 4936 scope.go:117] "RemoveContainer" containerID="814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791" Sep 30 15:08:51 crc kubenswrapper[4936]: E0930 15:08:51.379822 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791\": container with ID starting with 814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791 not found: ID does not exist" containerID="814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.379870 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791"} err="failed to get container status \"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791\": rpc error: code = NotFound desc = could not find container \"814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791\": container with ID starting with 814bde367665f73b787ac7d4b322e7ace2b6f7dfbb9f0af6555c901fc95d6791 not found: ID does not exist" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.379903 4936 scope.go:117] "RemoveContainer" containerID="9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252" Sep 30 15:08:51 crc kubenswrapper[4936]: E0930 
15:08:51.380287 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252\": container with ID starting with 9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252 not found: ID does not exist" containerID="9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252" Sep 30 15:08:51 crc kubenswrapper[4936]: I0930 15:08:51.380383 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252"} err="failed to get container status \"9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252\": rpc error: code = NotFound desc = could not find container \"9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252\": container with ID starting with 9f66bbabdb2448660455ed688a7935f629a3f0686a8ebe62eb4cf104a5e95252 not found: ID does not exist" Sep 30 15:08:52 crc kubenswrapper[4936]: I0930 15:08:52.327125 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" path="/var/lib/kubelet/pods/b74bc000-0baf-4a27-b535-c4b73ff777a8/volumes" Sep 30 15:09:18 crc kubenswrapper[4936]: I0930 15:09:18.250329 4936 patch_prober.go:28] interesting pod/machine-config-daemon-wj4sz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:09:18 crc kubenswrapper[4936]: I0930 15:09:18.250884 4936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wj4sz" podUID="e09d215c-5c94-4b2a-bc68-c51a84b784a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.300372 4936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:26 crc kubenswrapper[4936]: E0930 15:09:26.302393 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="extract-utilities" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.302530 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="extract-utilities" Sep 30 15:09:26 crc kubenswrapper[4936]: E0930 15:09:26.302636 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="registry-server" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.302720 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="registry-server" Sep 30 15:09:26 crc kubenswrapper[4936]: E0930 15:09:26.302835 4936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="extract-content" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.302924 4936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="extract-content" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.303252 4936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74bc000-0baf-4a27-b535-c4b73ff777a8" containerName="registry-server" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.304948 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.348598 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.366632 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.366716 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv55j\" (UniqueName: \"kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.366805 4936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.468524 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.468740 4936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.468814 4936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv55j\" (UniqueName: \"kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.469029 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.469068 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.496066 4936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv55j\" (UniqueName: \"kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j\") pod \"redhat-marketplace-w5vxn\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:26 crc kubenswrapper[4936]: I0930 15:09:26.635762 4936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:27 crc kubenswrapper[4936]: I0930 15:09:27.090971 4936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:27 crc kubenswrapper[4936]: I0930 15:09:27.602192 4936 generic.go:334] "Generic (PLEG): container finished" podID="6c948cfe-5094-40ba-9246-b054574e5dba" containerID="25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e" exitCode=0 Sep 30 15:09:27 crc kubenswrapper[4936]: I0930 15:09:27.602255 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerDied","Data":"25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e"} Sep 30 15:09:27 crc kubenswrapper[4936]: I0930 15:09:27.602733 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerStarted","Data":"7ba32ba0873d2cf45113d10e70d0ee3fc8a0c5f9e91aa57a807b86380f044f24"} Sep 30 15:09:29 crc kubenswrapper[4936]: I0930 15:09:29.636632 4936 generic.go:334] "Generic (PLEG): container finished" podID="6c948cfe-5094-40ba-9246-b054574e5dba" containerID="4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f" exitCode=0 Sep 30 15:09:29 crc kubenswrapper[4936]: I0930 15:09:29.636847 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerDied","Data":"4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f"} Sep 30 15:09:31 crc kubenswrapper[4936]: I0930 15:09:31.657484 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" 
event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerStarted","Data":"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91"} Sep 30 15:09:31 crc kubenswrapper[4936]: I0930 15:09:31.677815 4936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w5vxn" podStartSLOduration=2.783138938 podStartE2EDuration="5.677799316s" podCreationTimestamp="2025-09-30 15:09:26 +0000 UTC" firstStartedPulling="2025-09-30 15:09:27.60845088 +0000 UTC m=+5417.992453181" lastFinishedPulling="2025-09-30 15:09:30.503111258 +0000 UTC m=+5420.887113559" observedRunningTime="2025-09-30 15:09:31.674357751 +0000 UTC m=+5422.058360052" watchObservedRunningTime="2025-09-30 15:09:31.677799316 +0000 UTC m=+5422.061801617" Sep 30 15:09:36 crc kubenswrapper[4936]: I0930 15:09:36.636227 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:36 crc kubenswrapper[4936]: I0930 15:09:36.637425 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:36 crc kubenswrapper[4936]: I0930 15:09:36.684571 4936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:36 crc kubenswrapper[4936]: I0930 15:09:36.777214 4936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:36 crc kubenswrapper[4936]: I0930 15:09:36.916201 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:38 crc kubenswrapper[4936]: I0930 15:09:38.737733 4936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w5vxn" podUID="6c948cfe-5094-40ba-9246-b054574e5dba" containerName="registry-server" 
containerID="cri-o://b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91" gracePeriod=2 Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.256426 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.382076 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv55j\" (UniqueName: \"kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j\") pod \"6c948cfe-5094-40ba-9246-b054574e5dba\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.382137 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities\") pod \"6c948cfe-5094-40ba-9246-b054574e5dba\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.382176 4936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content\") pod \"6c948cfe-5094-40ba-9246-b054574e5dba\" (UID: \"6c948cfe-5094-40ba-9246-b054574e5dba\") " Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.383176 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities" (OuterVolumeSpecName: "utilities") pod "6c948cfe-5094-40ba-9246-b054574e5dba" (UID: "6c948cfe-5094-40ba-9246-b054574e5dba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.388101 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j" (OuterVolumeSpecName: "kube-api-access-gv55j") pod "6c948cfe-5094-40ba-9246-b054574e5dba" (UID: "6c948cfe-5094-40ba-9246-b054574e5dba"). InnerVolumeSpecName "kube-api-access-gv55j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.404031 4936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c948cfe-5094-40ba-9246-b054574e5dba" (UID: "6c948cfe-5094-40ba-9246-b054574e5dba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.484894 4936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.484938 4936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv55j\" (UniqueName: \"kubernetes.io/projected/6c948cfe-5094-40ba-9246-b054574e5dba-kube-api-access-gv55j\") on node \"crc\" DevicePath \"\"" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.484950 4936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c948cfe-5094-40ba-9246-b054574e5dba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.752752 4936 generic.go:334] "Generic (PLEG): container finished" podID="6c948cfe-5094-40ba-9246-b054574e5dba" 
containerID="b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91" exitCode=0 Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.752807 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerDied","Data":"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91"} Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.752821 4936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5vxn" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.752839 4936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5vxn" event={"ID":"6c948cfe-5094-40ba-9246-b054574e5dba","Type":"ContainerDied","Data":"7ba32ba0873d2cf45113d10e70d0ee3fc8a0c5f9e91aa57a807b86380f044f24"} Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.752861 4936 scope.go:117] "RemoveContainer" containerID="b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.789376 4936 scope.go:117] "RemoveContainer" containerID="4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.804620 4936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.817507 4936 scope.go:117] "RemoveContainer" containerID="25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.818274 4936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5vxn"] Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.877137 4936 scope.go:117] "RemoveContainer" containerID="b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91" Sep 30 
15:09:39 crc kubenswrapper[4936]: E0930 15:09:39.878137 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91\": container with ID starting with b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91 not found: ID does not exist" containerID="b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.878244 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91"} err="failed to get container status \"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91\": rpc error: code = NotFound desc = could not find container \"b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91\": container with ID starting with b86e1c16edc6e88b83af088212a3d12ee9adce8515b958eccf77c46ba754da91 not found: ID does not exist" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.878383 4936 scope.go:117] "RemoveContainer" containerID="4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f" Sep 30 15:09:39 crc kubenswrapper[4936]: E0930 15:09:39.878873 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f\": container with ID starting with 4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f not found: ID does not exist" containerID="4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.878898 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f"} err="failed to get container status 
\"4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f\": rpc error: code = NotFound desc = could not find container \"4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f\": container with ID starting with 4b36b905c902b076e96e658a95d45d09d30d97f84a8eb00c2245d5168e627d9f not found: ID does not exist" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.878913 4936 scope.go:117] "RemoveContainer" containerID="25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e" Sep 30 15:09:39 crc kubenswrapper[4936]: E0930 15:09:39.879195 4936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e\": container with ID starting with 25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e not found: ID does not exist" containerID="25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e" Sep 30 15:09:39 crc kubenswrapper[4936]: I0930 15:09:39.879236 4936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e"} err="failed to get container status \"25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e\": rpc error: code = NotFound desc = could not find container \"25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e\": container with ID starting with 25af7dd9336671e25c6387cb879e7aace4e9f29030fa9398bc34b6fabb8a455e not found: ID does not exist" Sep 30 15:09:40 crc kubenswrapper[4936]: I0930 15:09:40.340261 4936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c948cfe-5094-40ba-9246-b054574e5dba" path="/var/lib/kubelet/pods/6c948cfe-5094-40ba-9246-b054574e5dba/volumes"